Packages for data analysis¶

In [1]:
import pandas as pd
import numpy as np
from itertools import combinations

Packages for visualisation¶

In [2]:
import matplotlib.pyplot as plt
import seaborn as sns
%matplotlib inline
sns.set_theme(style="darkgrid")
# Others
import warnings
warnings.filterwarnings("ignore")

Rain in Australia¶

Description¶

This dataset contains daily weather observations from numerous Australian weather stations.

References¶

This total dataset is taken from following reference. https://www.kaggle.com/datasets/jsphyg/weather-dataset-rattle-package

Features description¶

Date---The date of observation

Location---The common name of the location of the weather station

MinTemp---The minimum temperature in degrees celsius

MaxTemp---The maximum temperature in degrees celsius

Rainfall---The amount of rainfall recorded for the day in mm

Evaporation---The so-called Class A pan evaporation (mm) in the 24 hours to 9am

Sunshine---The number of hours of bright sunshine in the day.

WindGustDir---The direction of the strongest wind gust in the 24 hours to midnight

WindGustSpeed---The speed (km/h) of the strongest wind gust in the 24 hours to midnight

WindDir9am---Direction of the wind at 9am

WindDir3pm---Direction of the wind at 3pm

WindSpeed9am---Wind speed (km/hr) averaged over 10 minutes prior to 9am

WindSpeed3pm---Wind speed (km/hr) averaged over 10 minutes prior to 3pm

Humidity9am---Humidity (percent) at 9am

Humidity3pm---Humidity (percent) at 3pm

Pressure9am---Atmospheric pressure (hpa) reduced to mean sea level at 9am

Pressure3pm---Atmospheric pressure (hpa) reduced to mean sea level at 3pm

Cloud9am---Fraction of sky obscured by cloud at 9am. This is measured in "oktas", which are a unit of eigths. It records how many eigths of the sky are obscured by cloud. A 0 measure indicates completely clear sky whilst an 8 indicates that it is completely overcast.

Cloud3pm---Fraction of sky obscured by cloud (in "oktas": eighths) at 3pm. See Cload9am for a description of the values

Temp9am---Temperature (degrees C) at 9am

Temp3pm---Temperature (degrees C) at 3pm

RainToday---Boolean: 1 if precipitation (mm) in the 24 hours to 9am exceeds 1mm, otherwise 0

RISK_MM---The amount of next day rain in mm. Used to create response variable RainTomorrow. A kind of measure of the "risk".

RainTomorrow---The target variable. Did it rain tomorrow?

In [3]:
df=pd.read_csv("weatherAUS.csv")
Original_Data=df
In [4]:
df['Date'] = pd.to_datetime(df['Date'])
#df['Location'] = pd.Categorical(df.Location)
In [5]:
df.head()
Out[5]:
Date Location MinTemp MaxTemp Rainfall Evaporation Sunshine WindGustDir WindGustSpeed WindDir9am ... Humidity9am Humidity3pm Pressure9am Pressure3pm Cloud9am Cloud3pm Temp9am Temp3pm RainToday RainTomorrow
0 2008-12-01 Albury 13.4 22.9 0.6 NaN NaN W 44.0 W ... 71.0 22.0 1007.7 1007.1 8.0 NaN 16.9 21.8 No No
1 2008-12-02 Albury 7.4 25.1 0.0 NaN NaN WNW 44.0 NNW ... 44.0 25.0 1010.6 1007.8 NaN NaN 17.2 24.3 No No
2 2008-12-03 Albury 12.9 25.7 0.0 NaN NaN WSW 46.0 W ... 38.0 30.0 1007.6 1008.7 NaN 2.0 21.0 23.2 No No
3 2008-12-04 Albury 9.2 28.0 0.0 NaN NaN NE 24.0 SE ... 45.0 16.0 1017.6 1012.8 NaN NaN 18.1 26.5 No No
4 2008-12-05 Albury 17.5 32.3 1.0 NaN NaN W 41.0 ENE ... 82.0 33.0 1010.8 1006.0 7.0 8.0 17.8 29.7 No No

5 rows × 23 columns

In [6]:
df.shape
Out[6]:
(145460, 23)
In [7]:
fig, ax = plt.subplots(figsize=(12,8))
mask = np.triu(np.ones_like(df.corr(), dtype=np.bool_))
sns.heatmap(df.corr(), annot=True, cmap="Blues", mask=mask, linewidth=0.5)
Out[7]:
<Axes: >

Comment¶

Several features have very strong correlation

MinTemp ~ Temp9am (corr = 90%)

MaxTemp ~ Temp3pm (corr = 98%)

Temp9am ~ Temp3pm (corr = 86%)

Pressure3pm ~ Pressure9am (corr = 96%)

We will drop: Temp9am, Temp3pm, Pressure9am

In [8]:
cols_to_drop = ['Temp9am', 'Temp3pm', 'Pressure9am']
df=df.drop(cols_to_drop, axis=1)
In [9]:
df_na_rm  =df
df_outliers__colna_rm  =df
df_na_mean_imp  =df    
df.shape
Out[9]:
(145460, 20)

Data inspection¶

Data types¶

In [10]:
df.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 145460 entries, 0 to 145459
Data columns (total 20 columns):
 #   Column         Non-Null Count   Dtype         
---  ------         --------------   -----         
 0   Date           145460 non-null  datetime64[ns]
 1   Location       145460 non-null  object        
 2   MinTemp        143975 non-null  float64       
 3   MaxTemp        144199 non-null  float64       
 4   Rainfall       142199 non-null  float64       
 5   Evaporation    82670 non-null   float64       
 6   Sunshine       75625 non-null   float64       
 7   WindGustDir    135134 non-null  object        
 8   WindGustSpeed  135197 non-null  float64       
 9   WindDir9am     134894 non-null  object        
 10  WindDir3pm     141232 non-null  object        
 11  WindSpeed9am   143693 non-null  float64       
 12  WindSpeed3pm   142398 non-null  float64       
 13  Humidity9am    142806 non-null  float64       
 14  Humidity3pm    140953 non-null  float64       
 15  Pressure3pm    130432 non-null  float64       
 16  Cloud9am       89572 non-null   float64       
 17  Cloud3pm       86102 non-null   float64       
 18  RainToday      142199 non-null  object        
 19  RainTomorrow   142193 non-null  object        
dtypes: datetime64[ns](1), float64(13), object(6)
memory usage: 22.2+ MB

Extract numerical features¶

In [11]:
num_cols=df.select_dtypes(include=np.number).columns.tolist()
print('There are', len(num_cols), 'numerical features, including:')
print(num_cols, "\n")
There are 13 numerical features, including:
['MinTemp', 'MaxTemp', 'Rainfall', 'Evaporation', 'Sunshine', 'WindGustSpeed', 'WindSpeed9am', 'WindSpeed3pm', 'Humidity9am', 'Humidity3pm', 'Pressure3pm', 'Cloud9am', 'Cloud3pm'] 

Extract categorical features¶

In [12]:
cat_cols=df.select_dtypes(object).columns.tolist()
print('There are', len(cat_cols), 'categorical features, including:')
print(cat_cols)
There are 6 categorical features, including:
['Location', 'WindGustDir', 'WindDir9am', 'WindDir3pm', 'RainToday', 'RainTomorrow']

Data quality¶

Checking data completeness¶

In [13]:
missing = pd.DataFrame(df.isnull().sum(), columns=['No. of missing values'])
missing['% missing_values'] = (missing/len(df)).round(2)*100
missing
Out[13]:
No. of missing values % missing_values
Date 0 0.0
Location 0 0.0
MinTemp 1485 1.0
MaxTemp 1261 1.0
Rainfall 3261 2.0
Evaporation 62790 43.0
Sunshine 69835 48.0
WindGustDir 10326 7.0
WindGustSpeed 10263 7.0
WindDir9am 10566 7.0
WindDir3pm 4228 3.0
WindSpeed9am 1767 1.0
WindSpeed3pm 3062 2.0
Humidity9am 2654 2.0
Humidity3pm 4507 3.0
Pressure3pm 15028 10.0
Cloud9am 55888 38.0
Cloud3pm 59358 41.0
RainToday 3261 2.0
RainTomorrow 3267 2.0

Remove rows where target variables are missing¶

In [14]:
df.dropna(how='all', subset=['RainTomorrow'],inplace=True) # Remove rows where target varible is missing
df.shape
Out[14]:
(142193, 20)

Where are missing values located in the dataset (white color indicates missing values)¶

In [15]:
import missingno as msno
msno.bar(df)
Out[15]:
<Axes: >
In [16]:
#plt.figure(figsize=(10,5))
#sns.heatmap(df.isnull(), cbar = False, cmap="viridis")
sns.heatmap( df.isnull(),cmap=sns.cubehelix_palette(as_cmap=True))
Out[16]:
<Axes: >
In [17]:
print('There are', len(cat_cols), 'categorical features, including:', "\n", cat_cols, '\n')
# Extract details on categorical features
for i in cat_cols:
    unique_no = df[i].nunique()
    unique_name = df[i].unique().tolist()
    print(i, 'has', unique_no, 'unique variables, including:')
    print(unique_name, "\n")
There are 6 categorical features, including: 
 ['Location', 'WindGustDir', 'WindDir9am', 'WindDir3pm', 'RainToday', 'RainTomorrow'] 

Location has 49 unique variables, including:
['Albury', 'BadgerysCreek', 'Cobar', 'CoffsHarbour', 'Moree', 'Newcastle', 'NorahHead', 'NorfolkIsland', 'Penrith', 'Richmond', 'Sydney', 'SydneyAirport', 'WaggaWagga', 'Williamtown', 'Wollongong', 'Canberra', 'Tuggeranong', 'MountGinini', 'Ballarat', 'Bendigo', 'Sale', 'MelbourneAirport', 'Melbourne', 'Mildura', 'Nhil', 'Portland', 'Watsonia', 'Dartmoor', 'Brisbane', 'Cairns', 'GoldCoast', 'Townsville', 'Adelaide', 'MountGambier', 'Nuriootpa', 'Woomera', 'Albany', 'Witchcliffe', 'PearceRAAF', 'PerthAirport', 'Perth', 'SalmonGums', 'Walpole', 'Hobart', 'Launceston', 'AliceSprings', 'Darwin', 'Katherine', 'Uluru'] 

WindGustDir has 16 unique variables, including:
['W', 'WNW', 'WSW', 'NE', 'NNW', 'N', 'NNE', 'SW', 'ENE', 'SSE', 'S', 'NW', 'SE', 'ESE', nan, 'E', 'SSW'] 

WindDir9am has 16 unique variables, including:
['W', 'NNW', 'SE', 'ENE', 'SW', 'SSE', 'S', 'NE', nan, 'SSW', 'N', 'WSW', 'ESE', 'E', 'NW', 'WNW', 'NNE'] 

WindDir3pm has 16 unique variables, including:
['WNW', 'WSW', 'E', 'NW', 'W', 'SSE', 'ESE', 'ENE', 'NNW', 'SSW', 'SW', 'SE', 'N', 'S', 'NNE', nan, 'NE'] 

RainToday has 2 unique variables, including:
['No', 'Yes', nan] 

RainTomorrow has 2 unique variables, including:
['No', 'Yes'] 

Summary of categorical data¶

In [18]:
ncols=3
nrows= int(np.floor(len(cat_cols)/ncols) + np.ceil(len(cat_cols)%ncols/ncols))
fig, axs = plt.subplots(nrows, ncols, figsize=(ncols*6, nrows*3))           
                                                
for row in range(nrows):
    for column in range(ncols):
        try:
            feature = cat_cols[row*ncols+column]
            sns.countplot(y=feature, data=df, ax=axs[row, column], color='#99befd')
        except:
            pass
plt.tight_layout(pad=0.5)

A quick glance at Numerical Features¶

In [19]:
df.describe()
Out[19]:
MinTemp MaxTemp Rainfall Evaporation Sunshine WindGustSpeed WindSpeed9am WindSpeed3pm Humidity9am Humidity3pm Pressure3pm Cloud9am Cloud3pm
count 141556.000000 141871.000000 140787.000000 81350.000000 74377.000000 132923.000000 140845.000000 139563.000000 140419.000000 138583.000000 128212.000000 88536.000000 85099.000000
mean 12.186400 23.226784 2.349974 5.469824 7.624853 39.984292 14.001988 18.637576 68.843810 51.482606 1015.258204 4.437189 4.503167
std 6.403283 7.117618 8.465173 4.188537 3.781525 13.588801 8.893337 8.803345 19.051293 20.797772 7.036677 2.887016 2.720633
min -8.500000 -4.800000 0.000000 0.000000 0.000000 6.000000 0.000000 0.000000 0.000000 0.000000 977.100000 0.000000 0.000000
25% 7.600000 17.900000 0.000000 2.600000 4.900000 31.000000 7.000000 13.000000 57.000000 37.000000 1010.400000 1.000000 2.000000
50% 12.000000 22.600000 0.000000 4.800000 8.500000 39.000000 13.000000 19.000000 70.000000 52.000000 1015.200000 5.000000 5.000000
75% 16.800000 28.200000 0.800000 7.400000 10.600000 48.000000 19.000000 24.000000 83.000000 66.000000 1020.000000 7.000000 7.000000
max 33.900000 48.100000 371.000000 145.000000 14.500000 135.000000 130.000000 87.000000 100.000000 100.000000 1039.600000 9.000000 9.000000

Plots on numerical features to check data quality and data distribution¶

In [20]:
num_cols=df.select_dtypes(include=['int64','float64']).columns.tolist() # a revised list of numerical features  
for i in num_cols:   
    fig, axs = plt.subplots(1,2,figsize=(15, 3))
    sns.histplot(df[i],bins=20, kde=True,ax=axs[0]);
    sns.boxplot(df[i], ax = axs[1], color='#99befd', fliersize=1);    
    axs[0].axvline(df[i].median(), color='r', linewidth=2, linestyle='--', label='Mean')
    axs[0].legend()

IQR¶

In [21]:
plt.figure(figsize=[25,15])
Original_Data.boxplot(column= ['MinTemp', 'MaxTemp', 'Rainfall', 'Evaporation', 'Sunshine', 'WindGustSpeed', 'WindSpeed9am', 'WindSpeed3pm', 'Humidity9am', 'Humidity3pm', 'Pressure9am', 'Pressure3pm', 'Cloud9am', 'Cloud3pm', 'Temp9am', 'Temp3pm'])
plt.xticks(rotation=45)
plt.show()

Na Removed¶

In [22]:
df_na_rm=df_na_rm.dropna(axis=0)
df_na_rm.isnull().sum()
Out[22]:
Date             0
Location         0
MinTemp          0
MaxTemp          0
Rainfall         0
Evaporation      0
Sunshine         0
WindGustDir      0
WindGustSpeed    0
WindDir9am       0
WindDir3pm       0
WindSpeed9am     0
WindSpeed3pm     0
Humidity9am      0
Humidity3pm      0
Pressure3pm      0
Cloud9am         0
Cloud3pm         0
RainToday        0
RainTomorrow     0
dtype: int64
In [23]:
df_na_rm.shape
Out[23]:
(56452, 20)

Outlaires removed and Columns droped that have Sigmificant null values¶

In [24]:
df_outliers__colna_rm=df_outliers__colna_rm.drop(['Evaporation', 'Sunshine', 'Cloud9am', 'Cloud3pm'], axis=1)
df_outliers__colna_rm=df_outliers__colna_rm.dropna(axis=0)
df_outliers__colna_rm.isnull().sum()
Out[24]:
Date             0
Location         0
MinTemp          0
MaxTemp          0
Rainfall         0
WindGustDir      0
WindGustSpeed    0
WindDir9am       0
WindDir3pm       0
WindSpeed9am     0
WindSpeed3pm     0
Humidity9am      0
Humidity3pm      0
Pressure3pm      0
RainToday        0
RainTomorrow     0
dtype: int64
In [25]:
df_outliers__colna_rm  =df_outliers__colna_rm  .dropna(axis=0)
num_cols=df_outliers__colna_rm  .select_dtypes(include=['int64','float64']).columns.tolist() # a revised list of numerical features
for i in num_cols:
    q1=df_outliers__colna_rm  [i].quantile(0.25)
    q3=df_outliers__colna_rm  [i].quantile(0.75)
    iqr=q3-q1
    upper_limit=q3+(1.5*iqr)
    lower_limit=q1-(1.5*iqr)
    
    # find the outliers
    #df.loc[(df[i] > upper_limit) | (df[i] < lower_limit)]
    # trimming - delete the outlier data
    new_df_outliers__colna_rm   = df_outliers__colna_rm  .loc[(df_outliers__colna_rm  [i] <= upper_limit) & (df_outliers__colna_rm  [i] >= lower_limit)]
    print('before removing outliers:', len(df_outliers__colna_rm  ))
    print('after removing outliers:',len(new_df_outliers__colna_rm  ))
    print('outliers:', len(df_outliers__colna_rm  )-len(new_df_outliers__colna_rm  ))
before removing outliers: 113011
after removing outliers: 112985
outliers: 26
before removing outliers: 113011
after removing outliers: 112935
outliers: 76
before removing outliers: 113011
after removing outliers: 92647
outliers: 20364
before removing outliers: 113011
after removing outliers: 110380
outliers: 2631
before removing outliers: 113011
after removing outliers: 111019
outliers: 1992
before removing outliers: 113011
after removing outliers: 110812
outliers: 2199
before removing outliers: 113011
after removing outliers: 111518
outliers: 1493
before removing outliers: 113011
after removing outliers: 113011
outliers: 0
before removing outliers: 113011
after removing outliers: 112093
outliers: 918
In [26]:
df_outliers__colna_rm.shape
Out[26]:
(113011, 16)

Mean Imputed Outlieres Removed¶

In [27]:
num_cols=df_na_mean_imp.select_dtypes(include=['int64','float64']).columns.tolist() # a revised list of numerical features
cat_cols=df_na_mean_imp.select_dtypes(include=['category','object',"datetime64[ns]"]).columns.tolist() # a revised list of numerical features
for i in num_cols:
    q1=df_na_mean_imp[i].quantile(0.25)
    q3=df_na_mean_imp[i].quantile(0.75)
    iqr=q3-q1
    upper_limit=q3+(1.5*iqr)
    lower_limit=q1-(1.5*iqr)
new_df_na_mean_imp = df_na_mean_imp.loc[(df_na_mean_imp[i] <= upper_limit) & (df_na_mean_imp[i] >= lower_limit)]
df_na_mean_imp=new_df_na_mean_imp  


# Impute missing values for categorical features
#mode_values=dd[cat_cols].mode()
#dd[cat_cols] = dd[cat_cols].fillna(value=mode_values[0])
df_na_mean_imp["WindGustDir"] = df_na_mean_imp["WindGustDir"].fillna(df_na_mean_imp["WindGustDir"].mode()[0])
df_na_mean_imp["WindDir9am"] = df_na_mean_imp["WindDir9am"].fillna(df_na_mean_imp["WindDir9am"].mode()[0])
df_na_mean_imp["WindDir3pm"] = df_na_mean_imp["WindDir3pm"].fillna(df_na_mean_imp["WindDir3pm"].mode()[0])
df_na_mean_imp["RainToday"] = df_na_mean_imp["RainToday"].fillna(df_na_mean_imp["RainToday"].mode()[0])


# Impute missing values for numerical features
median_values = df_na_mean_imp[num_cols].median()
df_na_mean_imp[num_cols] = df_na_mean_imp[num_cols].fillna(value=median_values)

df_na_mean_imp.shape
Out[27]:
(85099, 20)
In [28]:
df_na_mean_imp.isnull().sum()
Out[28]:
Date             0
Location         0
MinTemp          0
MaxTemp          0
Rainfall         0
Evaporation      0
Sunshine         0
WindGustDir      0
WindGustSpeed    0
WindDir9am       0
WindDir3pm       0
WindSpeed9am     0
WindSpeed3pm     0
Humidity9am      0
Humidity3pm      0
Pressure3pm      0
Cloud9am         0
Cloud3pm         0
RainToday        0
RainTomorrow     0
dtype: int64

Imputaion With Mice¶

Predictive Mean Matching¶

Imputaion with Outlieris removed¶

df_imputed_outliers_removed=df df_imputed_outliers_removed=df_imputed_outliers_removed.drop(['Evaporation', 'Sunshine', 'Cloud9am', 'Cloud3pm'], axis=1)

iqr¶

num_cols=df_imputed_outliers_removed.select_dtypes(include=['int64','float64']).columns.tolist() # a revised list of numerical features cat_cols=df_imputed_outliers_removed.select_dtypes(include=['category','object',"datetime64[ns]"]).columns.tolist() # a revised list of numerical features

for i in num_cols: q1=df_imputed_outliers_removed[i].quantile(0.25) q3=df_imputed_outliers_removed[i].quantile(0.75) iqr=q3-q1 upper_limit=q3+(1.5iqr) lower_limit=q1-(1.5iqr) new_df_imputed_outliers_removed = df_imputed_outliers_removed.loc[(df_imputed_outliers_removed[i] <= upper_limit) & (df_imputed_outliers_removed[i] >= lower_limit)] df_imputed_outliers_removed=new_df_imputed_outliers_removed
df_imputed_outliers_removed['WindGustDir'] = pd.Categorical(df_imputed_outliers_removed.WindGustDir) df_imputed_outliers_removed['WindDir9am'] = pd.Categorical(df_imputed_outliers_removed.WindDir9am) df_imputed_outliers_removed['WindDir3pm'] = pd.Categorical(df_imputed_outliers_removed.WindDir3pm) df_imputed_outliers_removed['RainToday'] = pd.Categorical(df_imputed_outliers_removed.RainToday) df_imputed_outliers_removed['RainTomorrow'] = pd.Categorical(df_imputed_outliers_removed.RainTomorrow) Date=df_imputed_outliers_removed['Date'] Location=df_imputed_outliers_removed["Location"] RainTomorrow=df_imputed_outliers_removed["RainTomorrow"] df_imputed_outliers_removed=df_imputed_outliers_removed.drop(["Date",'RainTomorrow',"Location"], axis=1)

import miceforest as mf kds=mf.ImputationKernel(df_imputed_outliers_removed,datasets=5,save_all_iterations=True,random_state=11) kds.mice(6)

kds.complete_data(4)¶

finalresult2=pd.concat([kds.complete_data(j) for j in range(5)]).groupby(level=0).mean()

df_imputed_outliers_removed=pd.concat([Date,Location,finalresult2,RainTomorrow],axis=1) df_imputed_outliers_removed.shape

EDA¶

In [29]:
plt.figure(figsize = (10, 10))
sns.histplot(x = 'WindSpeed3pm', hue = 'RainTomorrow', data = Original_Data , kde=True )
Out[29]:
<Axes: xlabel='WindSpeed3pm', ylabel='Count'>

Here As the wind speed at 3am increases so does the chances of raining tomorrow increases.

In [30]:
plt.figure(figsize = (10, 10))
sns.histplot(x = 'Sunshine', hue = 'RainTomorrow', data = Original_Data , kde=True )
Out[30]:
<Axes: xlabel='Sunshine', ylabel='Count'>

Here we can say that as the sunshine increases the chances of raining tomorrow decreses

In [31]:
plt.figure(figsize = (10, 10))
sns.histplot(x = 'Humidity9am', hue = 'RainTomorrow', data = Original_Data , kde=True )
Out[31]:
<Axes: xlabel='Humidity9am', ylabel='Count'>

Here As we can see that as Humidity at 9 am increases so does the chances of raining tomorrow increases.

In [32]:
plt.figure(figsize = (10, 10))
sns.histplot(x = 'Humidity3pm', hue = 'RainTomorrow', data = Original_Data , kde=True )
Out[32]:
<Axes: xlabel='Humidity3pm', ylabel='Count'>

Here we can see that as the Humidity at 3pm increases the chances of raining tomorrow Increses a lot

In [33]:
plt.figure(figsize = (10, 10))
sns.histplot(x = 'Pressure3pm', hue = 'RainTomorrow', data = Original_Data , kde=True )
Out[33]:
<Axes: xlabel='Pressure3pm', ylabel='Count'>
In [ ]:
 
In [34]:
plt.figure(figsize = (10, 10))
sns.histplot(x = 'Temp3pm', hue = 'RainTomorrow', data = Original_Data , kde=True )
Out[34]:
<Axes: xlabel='Temp3pm', ylabel='Count'>
In [35]:
plt.figure(figsize = (10, 10))
sns.histplot(x = 'Cloud3pm', hue = 'RainTomorrow', data = Original_Data , kde=True )
Out[35]:
<Axes: xlabel='Cloud3pm', ylabel='Count'>

When the percentage of clouds increases, the probability of rain increases tomorrow

In [36]:
Original_Data.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 145460 entries, 0 to 145459
Data columns (total 23 columns):
 #   Column         Non-Null Count   Dtype         
---  ------         --------------   -----         
 0   Date           145460 non-null  datetime64[ns]
 1   Location       145460 non-null  object        
 2   MinTemp        143975 non-null  float64       
 3   MaxTemp        144199 non-null  float64       
 4   Rainfall       142199 non-null  float64       
 5   Evaporation    82670 non-null   float64       
 6   Sunshine       75625 non-null   float64       
 7   WindGustDir    135134 non-null  object        
 8   WindGustSpeed  135197 non-null  float64       
 9   WindDir9am     134894 non-null  object        
 10  WindDir3pm     141232 non-null  object        
 11  WindSpeed9am   143693 non-null  float64       
 12  WindSpeed3pm   142398 non-null  float64       
 13  Humidity9am    142806 non-null  float64       
 14  Humidity3pm    140953 non-null  float64       
 15  Pressure9am    130395 non-null  float64       
 16  Pressure3pm    130432 non-null  float64       
 17  Cloud9am       89572 non-null   float64       
 18  Cloud3pm       86102 non-null   float64       
 19  Temp9am        143693 non-null  float64       
 20  Temp3pm        141851 non-null  float64       
 21  RainToday      142199 non-null  object        
 22  RainTomorrow   142193 non-null  object        
dtypes: datetime64[ns](1), float64(16), object(6)
memory usage: 25.5+ MB
In [37]:
import plotly.express as px
px.scatter(Original_Data.sample(2000),
           title='Min Temp. vs Max Temp.',
           x='MinTemp',y='MaxTemp',color='RainToday')

It shows a linear positive correlation between minimum temperature and maximum temperature.

In [38]:
sns.scatterplot(data=Original_Data,x='WindSpeed9am',y='WindSpeed3pm',hue='RainTomorrow')
Out[38]:
<Axes: xlabel='WindSpeed9am', ylabel='WindSpeed3pm'>
In [39]:
sns.barplot(data=Original_Data, x="RainTomorrow", y="Rainfall")
Out[39]:
<Axes: xlabel='RainTomorrow', ylabel='Rainfall'>
In [40]:
b=sns.countplot(x= 'WindGustDir' ,data = Original_Data ,palette='ocean'  )
plt.show()

Wind Gust Direction for maximum records(nearly 12000) is West

In [41]:
b=sns.countplot(x= 'WindDir9am' ,data = Original_Data ,palette='coolwarm'  )
plt.show()

Wind Direction at 9AM for maximum records is North followed by North-West and East.

In [42]:
b=sns.countplot(x= 'WindDir3pm' ,data = Original_Data ,palette='BuGn_r'  )
plt.show()

Wind Direction at 3PM for maximum records is South East

In [43]:
px.histogram(Original_Data, x='Location', 
             title='Location vs. Rainy Days', 
             color='RainToday')

Nhil, Darwin, Uluru reports least rain today ,canberra receives most rain among 49 locations.

In [44]:
plt.figure(figsize=(10,8))
plt.scatter(Original_Data['Location'],Original_Data['Rainfall'])
plt.xlabel("Location")
plt.xticks(rotation=90)
plt.ylabel("Rainfall")
plt.show()

Highest rain rates in Coffs Harbor and Darwin

In [45]:
px.scatter(Original_Data,
           title='Temp (3 pm) vs. Humidity (3 pm)',
           x='Temp3pm',
           y='Humidity3pm',
           color='RainTomorrow')

If the temperature today is low and humidity is high, it may rain tomorrow. If temperature today is high and humidity is low, it may not rain tomorrow

In [46]:
plt.figure(figsize=(10,10))
sns.countplot(x= 'WindGustDir' ,data =Original_Data ,palette='winter_r',hue='RainTomorrow')
plt.title('WindGustDir Vs RainTomorrow')
plt.show()

When the windGustDir in the west, more rain is expected

In [47]:
px.scatter(Original_Data,
           title='Temp (3 pm) vs. Humidity (3 pm)',
           x='Humidity3pm',
           y='Cloud3pm',
           color='RainTomorrow')
In [48]:
px.scatter(Original_Data,
           title='SunShine vs. Cloud (3 pm)',
           x='Sunshine',
           y='Cloud3pm',
           color='RainTomorrow')

Location: Which city has the most raining days?¶

In [49]:
rain_by_location =Original_Data.groupby('Location')['RainTomorrow'].count()/Original_Data['Location'].count()
In [50]:
rain_by_location = pd.crosstab(index=df['Location'], columns=df['RainTomorrow'], values=df['RainTomorrow'], aggfunc='count', margins=True)
rain_by_location['% Yes'] = (rain_by_location['Yes']/rain_by_location['All']).round(3)*100
rain_by_location.sort_values(by='% Yes', ascending=False)
f, ax = plt.subplots(figsize=(15,10))
rain_by_location['% Yes'].sort_values().plot(kind='barh', alpha=0.5)
ax.set_xlabel ('% raining days')
y = rain_by_location['% Yes'].sort_values().values
for h, v in enumerate(y):
    ax.text(v+0.5 , h-0.5 , round(float(v),1), color='blue')

Portland, Walpole, Cairns are the top3 locations in terms of number of raining days

Woomera, Uluru, AliceSprings are the bottom 3 of the list, with less than 10% of raining days

Whether MinTemp and MaxTemp impact RainTomorrow?¶

In [51]:
fig, ax = plt.subplots()
sns.scatterplot(x='MinTemp', y='MaxTemp', data=Original_Data, hue='RainTomorrow', alpha=0.5, style='RainTomorrow')
x = y = plt.xlim()
plt.plot(x, y, linestyle='--', color='g', lw=2, scalex=False, scaley=False)
plt.annotate('MaxTemp=MinTemp', xy=(30,30), xytext=(30,28), color='g')
Out[51]:
Text(30, 28, 'MaxTemp=MinTemp')

MaxTemp and MinTemp do not seem to directly impact the chance of raining tomorrow

However, when we draw a MaxTemp = MinTemp line, it seems that most of the 'Yes' result is falling close to this line

This implies that there is a higher chance to rain tomorrow if there is little variation between the max and min temperature

We will verify this by adding a new feature TempDiff to the dataset (TempDiff = MaxTemp - MinTemp)

In [52]:
# Adding a new feature 'TempDiff'
df['TempDiff'] = df['MaxTemp'] - df['MinTemp']
# TempDiff distribution
sns.histplot(x='TempDiff', data=df, bins=20, alpha=0.5, label='All RainTomorrow data')
df[df['RainTomorrow']=='Yes']['TempDiff'].plot.hist(bins=20, color='red', alpha=0.3, label='RainTomorrow = Yes')
plt.legend()
Out[52]:
<matplotlib.legend.Legend at 0x17fe80880>

It can be easily seen from the chart that when tempeature different is less than 5C, there is a higher chance of raining tomorrow.It can be easily seen from the chart that when tempeature different is less than 5C, there is a higher chance of raining tomorrow.

How about wind speed and direction?¶

There are 3 different numerical features (WindGustSpeed,WindSpeed9am and WindSpeed3pm) that are associated with wind speed.

There are 3 categorized features (WindGustDir,WindDir9am and WindDir3pm) that are associated with wind direction.

There are 3 categorized features (WindGustDir,WindDir9am and WindDir3pm) that are associated with wind direction.

Can we draw any conclusion from these values?

In [53]:
# Draw scatter charts with different wind speed data
wind_speed = ['WindGustSpeed', 'WindSpeed9am', 'WindSpeed3pm']
wind_speed_combination = [i for i in combinations(wind_speed,2)]
fig, axs = plt.subplots(1,3,figsize=(15, 4))
for i, ws in enumerate(wind_speed_combination):
    sns.scatterplot(x=ws[0], y=ws[1], data=Original_Data, hue='RainTomorrow', ax=axs[i], alpha=0.5)

WindGustSpeed seems to be a more important factor than WindSpeed9am and WindSpeed3pm

There is a higher chance of raining tomorrow when WindGustSpeedis higher than 75WindGustSpeed seems

In [54]:
# Relationship between humidity/pressure and RainTomorrow
fig, axs = plt.subplots(1,2,figsize=(15, 4))
sns.scatterplot(x='Humidity9am', y='Humidity3pm', data=Original_Data, hue='RainTomorrow', alpha=0.5, ax=axs[0])
sns.scatterplot(x='Humidity9am', y='Pressure3pm', data=Original_Data, hue='RainTomorrow', alpha=0.5, ax=axs[1])
Out[54]:
<Axes: xlabel='Humidity9am', ylabel='Pressure3pm'>

Higher chance of raining tomorrow with higher humidity and lower pressure

In [55]:
# Extract `Year` and 'Month' information from Date
Original_Data['Year'] = pd.DatetimeIndex(Original_Data['Date']).year
Original_Data['Month'] = pd.DatetimeIndex(Original_Data['Date']).month
rain_month = pd.crosstab(index=Original_Data['Month'], columns=Original_Data['RainTomorrow'], margins=True)
rain_month['%Yes'] = (rain_month['Yes'] / rain_month['All']).round(3)*100 
rain_month.iloc[:-1,-1].plot(style='.-')
plt.xlabel('Month')
plt.ylabel('% Raining days')
Out[55]:
Text(0, 0.5, '% Raining days')

Higher chance of raining between June and August

In [56]:
Original_Data.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 145460 entries, 0 to 145459
Data columns (total 25 columns):
 #   Column         Non-Null Count   Dtype         
---  ------         --------------   -----         
 0   Date           145460 non-null  datetime64[ns]
 1   Location       145460 non-null  object        
 2   MinTemp        143975 non-null  float64       
 3   MaxTemp        144199 non-null  float64       
 4   Rainfall       142199 non-null  float64       
 5   Evaporation    82670 non-null   float64       
 6   Sunshine       75625 non-null   float64       
 7   WindGustDir    135134 non-null  object        
 8   WindGustSpeed  135197 non-null  float64       
 9   WindDir9am     134894 non-null  object        
 10  WindDir3pm     141232 non-null  object        
 11  WindSpeed9am   143693 non-null  float64       
 12  WindSpeed3pm   142398 non-null  float64       
 13  Humidity9am    142806 non-null  float64       
 14  Humidity3pm    140953 non-null  float64       
 15  Pressure9am    130395 non-null  float64       
 16  Pressure3pm    130432 non-null  float64       
 17  Cloud9am       89572 non-null   float64       
 18  Cloud3pm       86102 non-null   float64       
 19  Temp9am        143693 non-null  float64       
 20  Temp3pm        141851 non-null  float64       
 21  RainToday      142199 non-null  object        
 22  RainTomorrow   142193 non-null  object        
 23  Year           145460 non-null  int64         
 24  Month          145460 non-null  int64         
dtypes: datetime64[ns](1), float64(16), int64(2), object(6)
memory usage: 27.7+ MB
In [57]:
def plot_df(df, x, y, title="", xlabel='Date', ylabel='Rainfall', dpi=1000):
    plt.figure(figsize=(16,5), dpi=dpi)
    plt.plot(x, y, color='steelblue')
    plt.gca().set(title=title, xlabel=xlabel, ylabel=ylabel)
    plt.show()
    

plot_df(Original_Data, x=Original_Data['Date'], y=Original_Data['Rainfall'], title='Rainfall')

We are going to plot features with datetime. Here, I am going to use date from last 3 years.

In [58]:
Original_Data_dateplot = Original_Data.iloc[-950:,:]
plt.figure(figsize=[16,5])
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['MinTemp'],color='blue',linewidth=1, label= 'MinTemp')
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['MaxTemp'],color='red',linewidth=1, label= 'MaxTemp')
plt.fill_between(Original_Data_dateplot['Date'],Original_Data_dateplot['MinTemp'],Original_Data_dateplot['MaxTemp'], facecolor = '#EBF78F')
plt.title('MinTemp vs MaxTemp by Date')
plt.legend(loc='lower left', frameon=False)
plt.show()

Above plot shows that the MinTemp and MaxTemp relatively increases and decreases every year.

The weather conditions are always opposite in the two hemispheres. As, the Australia is situated in the southern hemisphere. The seasons are bit different.

As you can see that, December to February is summer; March to May is autumn; June to August is winter; and September to November is spring.

In [59]:
Original_Data_dateplot
Out[59]:
Date Location MinTemp MaxTemp Rainfall Evaporation Sunshine WindGustDir WindGustSpeed WindDir9am ... Pressure9am Pressure3pm Cloud9am Cloud3pm Temp9am Temp3pm RainToday RainTomorrow Year Month
144510 2014-11-19 Uluru 20.0 40.0 0.0 NaN NaN SSW 56.0 ENE ... 1014.7 1008.5 NaN 8.0 30.0 37.4 No No 2014 11
144511 2014-11-20 Uluru 24.2 39.0 0.0 NaN NaN SSW 52.0 ESE ... 1012.1 1007.0 NaN 1.0 28.5 36.6 No No 2014 11
144512 2014-11-21 Uluru 21.4 42.4 0.0 NaN NaN WNW 54.0 NNE ... 1009.6 1004.5 NaN 1.0 31.3 40.5 No No 2014 11
144513 2014-11-22 Uluru 21.2 42.1 0.0 NaN NaN W 76.0 NNW ... 1009.1 1004.7 1.0 3.0 33.3 39.5 No Yes 2014 11
144514 2014-11-23 Uluru 20.4 40.1 1.2 NaN NaN WNW 54.0 N ... 1009.4 1006.1 NaN 8.0 30.9 39.1 Yes No 2014 11
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
145455 2017-06-21 Uluru 2.8 23.4 0.0 NaN NaN E 31.0 SE ... 1024.6 1020.3 NaN NaN 10.1 22.4 No No 2017 6
145456 2017-06-22 Uluru 3.6 25.3 0.0 NaN NaN NNW 22.0 SE ... 1023.5 1019.1 NaN NaN 10.9 24.5 No No 2017 6
145457 2017-06-23 Uluru 5.4 26.9 0.0 NaN NaN N 37.0 SE ... 1021.0 1016.8 NaN NaN 12.5 26.1 No No 2017 6
145458 2017-06-24 Uluru 7.8 27.0 0.0 NaN NaN SE 28.0 SSE ... 1019.4 1016.5 3.0 2.0 15.1 26.0 No No 2017 6
145459 2017-06-25 Uluru 14.9 NaN 0.0 NaN NaN NaN NaN ESE ... 1020.2 1017.9 8.0 8.0 15.0 20.9 No NaN 2017 6

950 rows × 25 columns

Rainfall¶

In [60]:
plt.figure(figsize=[16,5])
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['Rainfall'],color='violet', linewidth=2, label= 'Rainfall')
plt.legend(loc='upper left', frameon=False)
plt.title('Rainfall by Date')
plt.show()

WindGustSpeed¶

In [61]:
plt.figure(figsize=[16,5])
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['WindGustSpeed'],color='violet', linewidth=2, label= 'WindGustSpeed')
plt.legend(loc='upper left', frameon=False)
plt.title('WindGustSpeed by Date')
plt.show()

In Australia, wind speed is usually moderate. But, from above plot we can see that Dec-Feb is the windiest months.¶

WindSpeed9am and WindSpeed3pm¶

In [62]:
plt.figure(figsize=[16,5])
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['WindSpeed9am'],color='blue', linewidth=2, label= 'WindSpeed9am')
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['WindSpeed3pm'],color='green', linewidth=2, label= 'WindSpeed3pm')
plt.legend(loc='upper left', frameon=False)
plt.title('WindSpeed9am vs WindSpeed3pm by Date')
plt.show()

WindSpeed9am and WindSpeed3pm are relatively same around certain months.¶

In [63]:
px.scatter(Original_Data,
           title='WindSpeed3pm vs. WindSpeed9am',
           x='WindSpeed3pm',
           y='WindSpeed9am',
           color='RainTomorrow', width=600, height=600)

Pressure9am and Pressure3am¶

In [64]:
plt.figure(figsize=[16,5])
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['Pressure9am'],color='blue', linewidth=2, label= 'Pressure9am')
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['Pressure3pm'],color='green', linewidth=2, label= 'Pressure3pm')
plt.fill_between(Original_Data_dateplot['Date'],Original_Data_dateplot['Pressure9am'],Original_Data_dateplot['Pressure3pm'], facecolor = '#EBF78F')
plt.legend(loc='upper left', frameon=False)
plt.title('Pressure9am vs Pressure3pm by Date')
plt.show()

Pressure is high around the months of Jun-Aug and around Dec-Jan you can see that the pressure is low.

In a low pressure area the rising air cools and this is likely to condense water vapour and form clouds, and consequently rain.

Temp9am and Temp3pm¶

In [65]:
plt.figure(figsize=[16,5])
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['Temp9am'],color='blue', linewidth=2, label= 'Temp9am')
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['Temp3pm'],color='green', linewidth=2, label= 'Temp3pm')
plt.fill_between(Original_Data_dateplot['Date'],Original_Data_dateplot['Temp9am'],Original_Data_dateplot['Temp3pm'], facecolor = '#EBF78F')
plt.legend(loc='lower left', frameon=False)
plt.title('Temp9am vs Temp3pm by Date')
plt.show()

in the above plots, that Dec-Jan are months when the temperature is high but these are the months when the difference between temperature around 9am and 3pm is less as compare to the months of Jun-Aug when the difference is high.

In [66]:
plt.figure(figsize=[16,5])
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['Humidity9am'],color='blue',linewidth=1, label= 'Humidity9am')
plt.plot(Original_Data_dateplot['Date'],Original_Data_dateplot['Humidity3pm'],color='red',linewidth=1, label= 'Humidity3pm')
plt.fill_between(Original_Data_dateplot['Date'],Original_Data_dateplot['Humidity9am'],Original_Data_dateplot['Humidity3pm'], facecolor = '#EBF78F')
plt.title('Humidity9am vs Humidity3am by Date')
plt.legend(loc='lower left', frameon=False)
plt.show()

Packages for machine learning¶

In [67]:
from sklearn.preprocessing import LabelEncoder
from sklearn.model_selection import train_test_split
from sklearn.model_selection import cross_val_score
from sklearn.model_selection import KFold
#from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score 
from sklearn.metrics import precision_score
from sklearn.metrics import recall_score
from sklearn.metrics import f1_score
#from sklearn.metrics import roc_auc_score
#from sklearn.impute import KNNImputer

ML¶

In [68]:
df_na_mean_imp['TempDiff'] = df_na_mean_imp['MaxTemp'] - df_na_mean_imp['MinTemp']
df_na_mean_imp=df_na_mean_imp.drop(['MaxTemp','MinTemp'],axis=1)
In [69]:
cols = df_na_mean_imp.columns.tolist()
cols = cols[-1:] + cols[:-1]
df_na_mean_imp =df_na_mean_imp[cols]
df_na_mean_imp.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 85099 entries, 2 to 145458
Data columns (total 19 columns):
 #   Column         Non-Null Count  Dtype         
---  ------         --------------  -----         
 0   TempDiff       85099 non-null  float64       
 1   Date           85099 non-null  datetime64[ns]
 2   Location       85099 non-null  object        
 3   Rainfall       85099 non-null  float64       
 4   Evaporation    85099 non-null  float64       
 5   Sunshine       85099 non-null  float64       
 6   WindGustDir    85099 non-null  object        
 7   WindGustSpeed  85099 non-null  float64       
 8   WindDir9am     85099 non-null  object        
 9   WindDir3pm     85099 non-null  object        
 10  WindSpeed9am   85099 non-null  float64       
 11  WindSpeed3pm   85099 non-null  float64       
 12  Humidity9am    85099 non-null  float64       
 13  Humidity3pm    85099 non-null  float64       
 14  Pressure3pm    85099 non-null  float64       
 15  Cloud9am       85099 non-null  float64       
 16  Cloud3pm       85099 non-null  float64       
 17  RainToday      85099 non-null  object        
 18  RainTomorrow   85099 non-null  object        
dtypes: datetime64[ns](1), float64(12), object(6)
memory usage: 13.0+ MB
In [70]:
fig, ax = plt.subplots(figsize=(12,8))
mask = np.triu(np.ones_like(df_na_mean_imp .corr(), dtype=np.bool_))
sns.heatmap(df_na_mean_imp .corr(), annot=True, cmap="Blues", mask=mask, linewidth=0.5)
Out[70]:
<Axes: >
In [71]:
df_na_mean_imp.info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 85099 entries, 2 to 145458
Data columns (total 19 columns):
 #   Column         Non-Null Count  Dtype         
---  ------         --------------  -----         
 0   TempDiff       85099 non-null  float64       
 1   Date           85099 non-null  datetime64[ns]
 2   Location       85099 non-null  object        
 3   Rainfall       85099 non-null  float64       
 4   Evaporation    85099 non-null  float64       
 5   Sunshine       85099 non-null  float64       
 6   WindGustDir    85099 non-null  object        
 7   WindGustSpeed  85099 non-null  float64       
 8   WindDir9am     85099 non-null  object        
 9   WindDir3pm     85099 non-null  object        
 10  WindSpeed9am   85099 non-null  float64       
 11  WindSpeed3pm   85099 non-null  float64       
 12  Humidity9am    85099 non-null  float64       
 13  Humidity3pm    85099 non-null  float64       
 14  Pressure3pm    85099 non-null  float64       
 15  Cloud9am       85099 non-null  float64       
 16  Cloud3pm       85099 non-null  float64       
 17  RainToday      85099 non-null  object        
 18  RainTomorrow   85099 non-null  object        
dtypes: datetime64[ns](1), float64(12), object(6)
memory usage: 13.0+ MB
In [72]:
#df_na_mean_imp=pd.concat([Date,Location,finalresult2,RainTomorrow],axis=1)
df_na_mean_imp.shape
Out[72]:
(85099, 19)
In [73]:
le = LabelEncoder()
df_na_mean_imp[cat_cols] =df_na_mean_imp[cat_cols].astype('str').apply(le.fit_transform)
In [74]:
df_na_mean_imp.isnull().sum()
Out[74]:
TempDiff         0
Date             0
Location         0
Rainfall         0
Evaporation      0
Sunshine         0
WindGustDir      0
WindGustSpeed    0
WindDir9am       0
WindDir3pm       0
WindSpeed9am     0
WindSpeed3pm     0
Humidity9am      0
Humidity3pm      0
Pressure3pm      0
Cloud9am         0
Cloud3pm         0
RainToday        0
RainTomorrow     0
dtype: int64
In [75]:
df_na_mean_imp.head()
Out[75]:
TempDiff Date Location Rainfall Evaporation Sunshine WindGustDir WindGustSpeed WindDir9am WindDir3pm WindSpeed9am WindSpeed3pm Humidity9am Humidity3pm Pressure3pm Cloud9am Cloud3pm RainToday RainTomorrow
2 12.8 398 1 0.0 4.8 8.5 15 46.0 13 15 19.0 26.0 38.0 30.0 1008.7 5.0 2.0 0 0
4 14.8 400 1 1.0 4.8 8.5 13 41.0 1 7 7.0 20.0 82.0 33.0 1006.0 7.0 8.0 0 0
11 5.8 407 1 2.2 4.8 8.5 5 31.0 4 1 15.0 13.0 89.0 91.0 1004.2 8.0 8.0 1 1
12 2.7 408 1 15.6 4.8 8.5 13 61.0 6 6 28.0 28.0 76.0 93.0 993.0 8.0 8.0 1 1
13 8.4 409 1 3.6 4.8 8.5 12 44.0 13 11 24.0 20.0 65.0 43.0 1001.8 5.0 7.0 1 0
In [76]:
X = df_na_mean_imp.iloc[:, :-1]
y = df_na_mean_imp.iloc[:, -1:]

X_train, X_test, y_train, y_test = train_test_split(X, y,
                                                   test_size=0.3, random_state = 1)
In [77]:
#Logistic Regression
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.linear_model import LogisticRegression
lr = LogisticRegression()
lr.fit(X_train,y_train)
y_pred=predictions = lr.predict(X_test)
lr.score(X_train,y_train)
Out[77]:
0.83533381456798
In [78]:
Results=[]
r="df_na_mean_imp"
Results.append(r)
r={"Logistic Regression":83.533381}
Results.append(r)
Results
Out[78]:
['df_na_mean_imp', {'Logistic Regression': 83.533381}]
In [79]:
print(confusion_matrix(y_test, predictions))
cm = confusion_matrix(y_test, predictions)

fig, ax = plt.subplots(figsize=(4, 4))
ax.imshow(cm)
ax.grid(False)
ax.xaxis.set(ticks=(0, 1), ticklabels=('Predicted 0s', 'Predicted 1s'))
ax.yaxis.set(ticks=(0, 1), ticklabels=('Actual 0s', 'Actual 1s'))
ax.set_ylim(1.5, -0.5)
for i in range(2):
    for j in range(2):
        ax.text(j, i, cm[i, j], ha='center', va='center', color='red')
plt.show()
[[18236  1257]
 [ 2966  3071]]
In [80]:
from sklearn.metrics import mean_absolute_error, mean_squared_error
rmse = mean_squared_error(y_test,y_pred, squared=False)
mae= mean_absolute_error(y_test, y_pred)
mse= mean_squared_error(y_test, y_pred)
print(f"Mean Absolute Error: {mae:.3f}")
print(f"Mean Squared Error: {mse:.3f}")
print(f"Root Mean Squared Error: {rmse:.3f}")
Mean Absolute Error: 0.165
Mean Squared Error: 0.165
Root Mean Squared Error: 0.407
In [81]:
    print(classification_report(y_test, predictions))
              precision    recall  f1-score   support

           0       0.86      0.94      0.90     19493
           1       0.71      0.51      0.59      6037

    accuracy                           0.83     25530
   macro avg       0.78      0.72      0.74     25530
weighted avg       0.82      0.83      0.82     25530

In [82]:
#ROC Curve
from sklearn.metrics import roc_auc_score
from sklearn.metrics import roc_curve
logit_roc_auc = roc_auc_score(y_test, predictions)
fpr, tpr, thresholds = roc_curve(y_test, lr.predict_proba(X_test)[:,1])
plt.figure()
plt.plot(fpr, tpr, label='Logistic Regression (area = %0.2f)' % logit_roc_auc)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.savefig('Log_ROC')
plt.show()
In [83]:
from xgboost import XGBClassifier
xgb_model = XGBClassifier().fit(X_train, y_train)
y_pred = xgb_model.predict(X_test)
accuracy_score(y_test, y_pred)
Out[83]:
0.8524481002741873
In [84]:
r={"xgboost":85.244810}
Results.append(r)
Results
Out[84]:
['df_na_mean_imp', {'Logistic Regression': 83.533381}, {'xgboost': 85.24481}]
In [85]:
rmse = mean_squared_error(y_test,y_pred, squared=False)
mae= mean_absolute_error(y_test, y_pred)
mse= mean_squared_error(y_test, y_pred)
print(f"Mean Absolute Error: {mae:.3f}")
print(f"Mean Squared Error: {mse:.3f}")
print(f"Root Mean Squared Error: {rmse:.3f}")
Mean Absolute Error: 0.148
Mean Squared Error: 0.148
Root Mean Squared Error: 0.384
In [86]:
print(confusion_matrix(y_test, y_pred))
cm = confusion_matrix(y_test, y_pred)

fig, ax = plt.subplots(figsize=(4, 4))
ax.imshow(cm)
ax.grid(False)
ax.xaxis.set(ticks=(0, 1), ticklabels=('Predicted 0s', 'Predicted 1s'))
ax.yaxis.set(ticks=(0, 1), ticklabels=('Actual 0s', 'Actual 1s'))
ax.set_ylim(1.5, -0.5)
for i in range(2):
    for j in range(2):
        ax.text(j, i, cm[i, j], ha='center', va='center', color='red')
plt.show()
[[18267  1226]
 [ 2541  3496]]
In [87]:
# Compute micro-average ROC curve and ROC area
xgb_roc_auc = roc_auc_score(y_test, y_pred)
fpr, tpr, thresholds = roc_curve(y_test, xgb_model.predict_proba(X_test)[:,1])
plt.figure()
plt.plot(fpr, tpr, label='xgb_model (area = %0.2f)' % logit_roc_auc)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.savefig('Log_ROC')
plt.show()
In [88]:
print(classification_report(y_test, y_pred))
              precision    recall  f1-score   support

           0       0.88      0.94      0.91     19493
           1       0.74      0.58      0.65      6037

    accuracy                           0.85     25530
   macro avg       0.81      0.76      0.78     25530
weighted avg       0.85      0.85      0.85     25530

RandomForestClassifier¶

In [89]:
rf= RandomForestClassifier()
rf.fit(X_train, y_train)
y_pred = rf.predict(X_test)
In [90]:
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)
print('Mean Absolute Error:', mean_absolute_error(y_test, y_pred))
print('Mean Squared Error:', mean_squared_error(y_test, y_pred))
print('Root Mean Squared Error:', np.sqrt(mean_squared_error(y_test, y_pred)))
Accuracy: 0.8482961222091657
Mean Absolute Error: 0.1517038777908343
Mean Squared Error: 0.1517038777908343
Root Mean Squared Error: 0.3894918199280112
In [91]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.87      0.95      0.90     19493
           1       0.75      0.54      0.63      6037

    accuracy                           0.85     25530
   macro avg       0.81      0.74      0.77     25530
weighted avg       0.84      0.85      0.84     25530

In [92]:
r={"RandomForestClassifier":84.833529}
Results.append(r)
Results
Out[92]:
['df_na_mean_imp',
 {'Logistic Regression': 83.533381},
 {'xgboost': 85.24481},
 {'RandomForestClassifier': 84.833529}]
In [93]:
# Create the confusion matrix
from sklearn.metrics import ConfusionMatrixDisplay
cm = confusion_matrix(y_test, y_pred)

ConfusionMatrixDisplay(confusion_matrix=cm).plot();
In [94]:
# Export the first three decision trees from the forest
from sklearn.tree import export_graphviz
from IPython.display import Image
import graphviz
for i in range(3):
    tree = rf.estimators_[i]
    dot_data = export_graphviz(tree,
                               feature_names=X_train.columns,  
                               filled=True,  
                               max_depth=2, 
                               impurity=False, 
                               proportion=True)
    graph = graphviz.Source(dot_data)
    display(graph)
Tree 0 Humidity3pm <= 69.5 samples = 100.0% value = [0.764, 0.236] 1 WindDir9am <= 2.5 samples = 80.7% value = [0.855, 0.145] 0->1 True 9392 Pressure3pm <= 1013.75 samples = 19.3% value = [0.38, 0.62] 0->9392 False 2 Cloud9am <= 5.5 samples = 16.1% value = [0.902, 0.098] 1->2 1467 Cloud9am <= 3.5 samples = 64.6% value = [0.843, 0.157] 1->1467 3 (...) 2->3 718 (...) 2->718 1468 (...) 1467->1468 3833 (...) 1467->3833 9393 WindGustSpeed <= 40.0 samples = 8.8% value = [0.252, 0.748] 9392->9393 10780 RainToday <= 0.5 samples = 10.4% value = [0.488, 0.512] 9392->10780 9394 (...) 9393->9394 10123 (...) 9393->10123 10781 (...) 10780->10781 11958 (...) 10780->11958
Tree 0 Rainfall <= 0.75 samples = 100.0% value = [0.761, 0.239] 1 Cloud9am <= 5.5 samples = 73.6% value = [0.839, 0.161] 0->1 True 8548 TempDiff <= 7.45 samples = 26.4% value = [0.544, 0.456] 0->8548 False 2 Cloud3pm <= 5.5 samples = 45.8% value = [0.899, 0.101] 1->2 4131 Sunshine <= 4.15 samples = 27.8% value = [0.741, 0.259] 1->4131 3 (...) 2->3 2442 (...) 2->2442 4132 (...) 4131->4132 5659 (...) 4131->5659 8549 Humidity3pm <= 75.5 samples = 14.8% value = [0.431, 0.569] 8548->8549 11142 Cloud3pm <= 6.5 samples = 11.6% value = [0.691, 0.309] 8548->11142 8550 (...) 8549->8550 10279 (...) 8549->10279 11143 (...) 11142->11143 12234 (...) 11142->12234
Tree 0 Cloud3pm <= 6.5 samples = 100.0% value = [0.76, 0.24] 1 TempDiff <= 9.15 samples = 64.1% value = [0.873, 0.127] 0->1 True 6550 WindDir3pm <= 7.5 samples = 35.9% value = [0.558, 0.442] 0->6550 False 2 Humidity3pm <= 63.5 samples = 19.6% value = [0.764, 0.236] 1->2 3085 Sunshine <= 9.55 samples = 44.5% value = [0.921, 0.079] 1->3085 3 (...) 2->3 1704 (...) 2->1704 3086 (...) 3085->3086 5545 (...) 3085->5545 6551 RainToday <= 0.5 samples = 15.6% value = [0.526, 0.474] 6550->6551 9422 RainToday <= 0.5 samples = 20.3% value = [0.583, 0.417] 6550->9422 6552 (...) 6551->6552 8569 (...) 6551->8569 9423 (...) 9422->9423 11452 (...) 9422->11452
In [95]:
# Organizing feature names and importances in a DataFrame
features_df = pd.DataFrame({'features': rf.feature_names_in_, 'importances': rf.feature_importances_ })

# Sorting data from highest to lowest
features_df_sorted = features_df.sort_values(by='importances', ascending=False)

# Barplot of the result without borders and axis lines
g = sns.barplot(data=features_df_sorted, x='importances', y ='features', palette="rocket")
sns.despine(bottom = True, left = True)
g.set_title('Feature importances')
g.set(xlabel=None)
g.set(ylabel=None)
g.set(xticks=[])
for value in g.containers:
    g.bar_label(value, padding=2)

KNeighborsClassifier¶

In [96]:
knn = KNeighborsClassifier()
knn.fit(X_train, y_train)
knn_head = knn.predict(X_test)
print(f"""accuracy_score: {accuracy_score(knn_head, y_test)}
roc_auc_score: {roc_auc_score(knn_head, y_test)}""")
accuracy_score: 0.8278104191147669
roc_auc_score: 0.7732275541031326
In [97]:
def found_good_neighbors_1(n, p):
    knn = KNeighborsClassifier(n_neighbors=n, p=p, 
                               metric='minkowski')
    knn.fit(X_train, y_train)
    return knn.score(X_test, y_test)

def found_goot_depth(n, criterion_):
    tree = DecisionTreeClassifier(max_depth=n, 
                                  criterion=criterion_,
                                  random_state=42)
    tree.fit(X_train, y_train)
    return tree.score(X_test, y_test)
In [98]:
knn_1 = [found_good_neighbors_1(n, 1) for n in range(1, 22, 2)]
knn_2 = [found_good_neighbors_1(n, 2) for n in range(1, 22, 2)]
In [99]:
tree_gini = [found_goot_depth(n, 'gini') for n in range(1, 22, 2)]
tree_entropy = [found_goot_depth(n, 'entropy') for n in range(1, 22, 2)]
In [100]:
len(knn_1)
Out[100]:
11
In [101]:
l=knn_1[0]
a=1
for i in range(0,10):
    if (knn_1[i]>l):
        l=knn_1[i]
        if(i==0):
            a=1
        else:
            a=(i*2)+1
            
       
print (l,a)
0.836153544849197 9
In [102]:
plt.figure(figsize=(12, 7))
plt.subplot(2, 2, 1)
plt.plot(tree_gini)
plt.title('tree_gini')
plt.legend(['score'])
plt.subplot(2, 2, 2)
plt.plot(tree_entropy)
plt.title('tree_entropy')
plt.legend(['score'])
plt.subplot(2, 2, 3)
plt.plot(knn_1)
plt.title('knn_1')
plt.legend(['score'])
plt.subplot(2, 2, 4)
plt.plot(knn_2)
plt.title('knn_2')
plt.legend(['score'])
plt.show()
In [103]:
print(f"""
tree_gini: {max(tree_gini)}
tree_entropy: {max(tree_entropy)}
knn_1: {max(knn_1)}
knn_2: {max(knn_2)}
""")
tree_gini: 0.836662749706228
tree_entropy: 0.8303172737955347
knn_1: 0.836153544849197
knn_2: 0.8329808068938503

As we can see the decisive trees begin to fall at a depth of 4-5. What we cannot say about the nearest-neighbor method. I think we should still do tests starting from 20 to 50 in increments of 3 for nearest neighbours

In [104]:
knn_1 = [found_good_neighbors_1(n, 1) for n in range(20, 51, 3)]
knn_2 = [found_good_neighbors_1(n, 2) for n in range(20, 51, 3)]
In [105]:
plt.figure(figsize=(14, 9))
plt.subplot(2,2,1)
plt.plot(knn_1)
plt.title('knn_1')
plt.legend(['score'])
plt.subplot(2, 2, 2)
plt.plot(knn_2)
plt.title('knn_2')
plt.legend(['score'])
plt.show()
In [106]:
print(f"""
knn_1: {max(knn_1)}
knn_2: {max(knn_2)}
""")
knn_1: 0.8348217783000391
knn_2: 0.8318448883666275

In [107]:
#knn_1: 0.836153544849197 at k=9
knn = KNeighborsClassifier(n_neighbors=9, p=1, metric='minkowski')
knn.fit(X_train, y_train)
knn_head = knn.predict(X_test)
print(f"""accuracy_score: {accuracy_score(knn_head, y_test)}
roc_auc_score: {roc_auc_score(knn_head, y_test)}""")
print('Mean Absolute Error:', mean_absolute_error(y_test, knn_head))
print('Mean Squared Error:', mean_squared_error(y_test, knn_head))
print('Root Mean Squared Error:', np.sqrt(mean_squared_error(y_test, knn_head)))
accuracy_score: 0.836153544849197
roc_auc_score: 0.7987424699609573
Mean Absolute Error: 0.16384645515080298
Mean Squared Error: 0.16384645515080298
Root Mean Squared Error: 0.40477951424300485
In [108]:
r={"KNeighborsClassifier":83.6153544}
Results.append(r)
Results
Out[108]:
['df_na_mean_imp',
 {'Logistic Regression': 83.533381},
 {'xgboost': 85.24481},
 {'RandomForestClassifier': 84.833529},
 {'KNeighborsClassifier': 83.6153544}]
In [109]:
# Evaluate Model
# Create the confusion matrix
#from sklearn.metrics import ConfusionMatrixDisplay
cm = confusion_matrix(y_test, knn_head)

ConfusionMatrixDisplay(confusion_matrix=cm).plot();
In [110]:
print(classification_report(y_test,knn_head))
              precision    recall  f1-score   support

           0       0.85      0.95      0.90     19493
           1       0.75      0.47      0.57      6037

    accuracy                           0.84     25530
   macro avg       0.80      0.71      0.74     25530
weighted avg       0.83      0.84      0.82     25530

Gaussian Naive Bayes¶

In [111]:
from sklearn.naive_bayes import GaussianNB
gnb = GaussianNB()
gnb.fit(X_train, y_train)
y_pred = gnb.predict(X_test)
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)
report = classification_report(y_test, y_pred)
print("Classification Report:\n", report)
Accuracy: 0.7878965922444183
Classification Report:
               precision    recall  f1-score   support

           0       0.89      0.82      0.86     19493
           1       0.54      0.68      0.60      6037

    accuracy                           0.79     25530
   macro avg       0.72      0.75      0.73     25530
weighted avg       0.81      0.79      0.80     25530

In [112]:
from sklearn.metrics import roc_auc_score
ROC_AUC = roc_auc_score(y_test, y_pred)
# calculate cross-validated ROC AUC 
from sklearn.model_selection import cross_val_score
Cross_validated_ROC_AUC = cross_val_score(gnb, X_train, y_train, cv=5, scoring='roc_auc').mean()
print('ROC AUC : {:.4f}'.format(ROC_AUC))
print('Cross validated ROC AUC : {:.4f}'.format(Cross_validated_ROC_AUC))
ROC AUC : 0.7496
Cross validated ROC AUC : 0.8333
In [113]:
r={"Gaussian Naive Bayes":78.7896592}
Results.append(r)
Results
Out[113]:
['df_na_mean_imp',
 {'Logistic Regression': 83.533381},
 {'xgboost': 85.24481},
 {'RandomForestClassifier': 84.833529},
 {'KNeighborsClassifier': 83.6153544},
 {'Gaussian Naive Bayes': 78.7896592}]
In [114]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();

Gradient Boosting Classifier¶

In [115]:
from sklearn.ensemble import GradientBoostingClassifier
gbm_model = GradientBoostingClassifier().fit(X_train, y_train)
y_pred = gbm_model.predict(X_test)
accuracy_score(y_test, y_pred)
Out[115]:
0.845554249902076
In [116]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.87      0.94      0.90     19493
           1       0.74      0.54      0.62      6037

    accuracy                           0.85     25530
   macro avg       0.80      0.74      0.76     25530
weighted avg       0.84      0.85      0.84     25530

In [117]:
r={"Gradient Boosting Classifier":84.55542499}
Results.append(r)
Results
Out[117]:
['df_na_mean_imp',
 {'Logistic Regression': 83.533381},
 {'xgboost': 85.24481},
 {'RandomForestClassifier': 84.833529},
 {'KNeighborsClassifier': 83.6153544},
 {'Gaussian Naive Bayes': 78.7896592},
 {'Gradient Boosting Classifier': 84.55542499}]
In [118]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();

LightGBM¶

In [119]:
from lightgbm import LGBMClassifier
lgbm_model = LGBMClassifier().fit(X_train, y_train)
y_pred = lgbm_model.predict(X_test)
accuracy_score(y_test, y_pred)
Out[119]:
0.8521739130434782
In [120]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.88      0.94      0.91     19493
           1       0.75      0.57      0.65      6037

    accuracy                           0.85     25530
   macro avg       0.81      0.75      0.78     25530
weighted avg       0.84      0.85      0.84     25530

In [121]:
print('Training accuracy {:.4f}'.format(lgbm_model.score(X_train,y_train)))
print('Testing accuracy {:.4f}'.format(lgbm_model.score(X_test,y_test)))
Training accuracy 0.8680
Testing accuracy 0.8522
In [122]:
# As we can clearly see that there is absolutely no significant difference between both the accuracies and hence the model has made an estimation that is quite accurate.
In [123]:
r={"LightGBM":85.217391304}
Results.append(r)
Results
Out[123]:
['df_na_mean_imp',
 {'Logistic Regression': 83.533381},
 {'xgboost': 85.24481},
 {'RandomForestClassifier': 84.833529},
 {'KNeighborsClassifier': 83.6153544},
 {'Gaussian Naive Bayes': 78.7896592},
 {'Gradient Boosting Classifier': 84.55542499},
 {'LightGBM': 85.217391304}]
In [124]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();
In [125]:
import lightgbm as lgb
lgb.plot_importance(lgbm_model)
Out[125]:
<Axes: title={'center': 'Feature importance'}, xlabel='Feature importance', ylabel='Features'>
In [126]:
lgb.plot_tree(lgbm_model,figsize=(30,40))
Out[126]:
<Axes: >

Catboost¶

In [127]:
from catboost import CatBoostClassifier, Pool
cat = CatBoostClassifier()
cat.fit(X_train, y_train)
y_pred = cat.predict(X_test)
cat_finalscore = accuracy_score(y_test, y_pred)
Learning rate set to 0.059002
0:	learn: 0.6505008	total: 66.1ms	remaining: 1m 5s
1:	learn: 0.6146791	total: 70.6ms	remaining: 35.2s
2:	learn: 0.5843869	total: 75.4ms	remaining: 25s
3:	learn: 0.5553159	total: 79.7ms	remaining: 19.8s
4:	learn: 0.5327716	total: 84.7ms	remaining: 16.8s
5:	learn: 0.5102180	total: 89.3ms	remaining: 14.8s
6:	learn: 0.4939631	total: 94.2ms	remaining: 13.4s
7:	learn: 0.4801131	total: 98.8ms	remaining: 12.2s
8:	learn: 0.4677758	total: 103ms	remaining: 11.4s
9:	learn: 0.4556126	total: 108ms	remaining: 10.7s
10:	learn: 0.4460362	total: 113ms	remaining: 10.1s
11:	learn: 0.4371495	total: 117ms	remaining: 9.64s
12:	learn: 0.4276218	total: 122ms	remaining: 9.27s
13:	learn: 0.4203379	total: 127ms	remaining: 8.96s
14:	learn: 0.4151170	total: 132ms	remaining: 8.67s
15:	learn: 0.4099199	total: 136ms	remaining: 8.39s
16:	learn: 0.4054488	total: 141ms	remaining: 8.15s
17:	learn: 0.4007453	total: 145ms	remaining: 7.93s
18:	learn: 0.3970654	total: 150ms	remaining: 7.74s
19:	learn: 0.3931493	total: 154ms	remaining: 7.56s
20:	learn: 0.3901535	total: 160ms	remaining: 7.45s
21:	learn: 0.3873234	total: 165ms	remaining: 7.31s
22:	learn: 0.3848611	total: 170ms	remaining: 7.22s
23:	learn: 0.3827379	total: 175ms	remaining: 7.1s
24:	learn: 0.3803551	total: 180ms	remaining: 7.01s
25:	learn: 0.3787053	total: 185ms	remaining: 6.92s
26:	learn: 0.3766707	total: 189ms	remaining: 6.82s
27:	learn: 0.3753424	total: 194ms	remaining: 6.75s
28:	learn: 0.3738694	total: 199ms	remaining: 6.67s
29:	learn: 0.3722794	total: 204ms	remaining: 6.6s
30:	learn: 0.3710407	total: 210ms	remaining: 6.56s
31:	learn: 0.3699601	total: 214ms	remaining: 6.48s
32:	learn: 0.3689302	total: 220ms	remaining: 6.44s
33:	learn: 0.3680947	total: 225ms	remaining: 6.41s
34:	learn: 0.3667251	total: 231ms	remaining: 6.37s
35:	learn: 0.3657426	total: 236ms	remaining: 6.31s
36:	learn: 0.3647618	total: 241ms	remaining: 6.27s
37:	learn: 0.3638282	total: 246ms	remaining: 6.22s
38:	learn: 0.3632246	total: 250ms	remaining: 6.17s
39:	learn: 0.3624368	total: 255ms	remaining: 6.13s
40:	learn: 0.3614399	total: 261ms	remaining: 6.11s
41:	learn: 0.3607242	total: 267ms	remaining: 6.08s
42:	learn: 0.3601669	total: 272ms	remaining: 6.05s
43:	learn: 0.3597338	total: 276ms	remaining: 6s
44:	learn: 0.3591940	total: 281ms	remaining: 5.96s
45:	learn: 0.3588095	total: 286ms	remaining: 5.93s
46:	learn: 0.3582100	total: 292ms	remaining: 5.92s
47:	learn: 0.3577731	total: 297ms	remaining: 5.88s
48:	learn: 0.3573515	total: 301ms	remaining: 5.84s
49:	learn: 0.3570747	total: 306ms	remaining: 5.81s
50:	learn: 0.3565890	total: 311ms	remaining: 5.79s
51:	learn: 0.3561239	total: 316ms	remaining: 5.75s
52:	learn: 0.3557093	total: 321ms	remaining: 5.73s
53:	learn: 0.3552320	total: 325ms	remaining: 5.7s
54:	learn: 0.3548996	total: 331ms	remaining: 5.68s
55:	learn: 0.3544845	total: 335ms	remaining: 5.65s
56:	learn: 0.3541271	total: 340ms	remaining: 5.62s
57:	learn: 0.3537175	total: 345ms	remaining: 5.6s
58:	learn: 0.3534734	total: 349ms	remaining: 5.56s
59:	learn: 0.3531288	total: 354ms	remaining: 5.54s
60:	learn: 0.3528946	total: 358ms	remaining: 5.52s
61:	learn: 0.3525818	total: 363ms	remaining: 5.5s
62:	learn: 0.3522682	total: 368ms	remaining: 5.47s
63:	learn: 0.3520153	total: 373ms	remaining: 5.46s
64:	learn: 0.3517507	total: 378ms	remaining: 5.44s
65:	learn: 0.3513761	total: 383ms	remaining: 5.42s
66:	learn: 0.3510648	total: 389ms	remaining: 5.41s
67:	learn: 0.3508586	total: 393ms	remaining: 5.39s
68:	learn: 0.3506713	total: 398ms	remaining: 5.37s
69:	learn: 0.3504968	total: 403ms	remaining: 5.35s
70:	learn: 0.3503029	total: 408ms	remaining: 5.33s
71:	learn: 0.3500931	total: 412ms	remaining: 5.31s
72:	learn: 0.3498518	total: 417ms	remaining: 5.29s
73:	learn: 0.3496935	total: 421ms	remaining: 5.27s
74:	learn: 0.3493258	total: 426ms	remaining: 5.25s
75:	learn: 0.3489786	total: 430ms	remaining: 5.23s
76:	learn: 0.3487066	total: 435ms	remaining: 5.21s
77:	learn: 0.3485038	total: 439ms	remaining: 5.19s
78:	learn: 0.3482787	total: 444ms	remaining: 5.18s
79:	learn: 0.3480481	total: 449ms	remaining: 5.17s
80:	learn: 0.3478101	total: 454ms	remaining: 5.15s
81:	learn: 0.3476173	total: 459ms	remaining: 5.14s
82:	learn: 0.3473700	total: 464ms	remaining: 5.13s
83:	learn: 0.3471264	total: 469ms	remaining: 5.12s
84:	learn: 0.3469440	total: 474ms	remaining: 5.1s
85:	learn: 0.3465689	total: 479ms	remaining: 5.08s
86:	learn: 0.3463584	total: 483ms	remaining: 5.07s
87:	learn: 0.3461173	total: 488ms	remaining: 5.06s
88:	learn: 0.3459150	total: 493ms	remaining: 5.05s
89:	learn: 0.3457367	total: 498ms	remaining: 5.03s
90:	learn: 0.3453602	total: 503ms	remaining: 5.02s
91:	learn: 0.3451816	total: 507ms	remaining: 5.01s
92:	learn: 0.3450720	total: 512ms	remaining: 4.99s
93:	learn: 0.3449072	total: 517ms	remaining: 4.98s
94:	learn: 0.3447527	total: 523ms	remaining: 4.98s
95:	learn: 0.3446216	total: 527ms	remaining: 4.97s
96:	learn: 0.3445006	total: 532ms	remaining: 4.95s
97:	learn: 0.3442363	total: 537ms	remaining: 4.94s
98:	learn: 0.3440931	total: 541ms	remaining: 4.92s
99:	learn: 0.3438826	total: 546ms	remaining: 4.91s
100:	learn: 0.3436331	total: 551ms	remaining: 4.9s
101:	learn: 0.3434886	total: 555ms	remaining: 4.89s
102:	learn: 0.3433510	total: 560ms	remaining: 4.87s
103:	learn: 0.3431080	total: 565ms	remaining: 4.86s
104:	learn: 0.3429792	total: 569ms	remaining: 4.85s
105:	learn: 0.3428181	total: 574ms	remaining: 4.84s
106:	learn: 0.3427074	total: 579ms	remaining: 4.83s
107:	learn: 0.3425258	total: 584ms	remaining: 4.82s
108:	learn: 0.3423537	total: 589ms	remaining: 4.81s
109:	learn: 0.3421398	total: 593ms	remaining: 4.8s
110:	learn: 0.3419901	total: 597ms	remaining: 4.78s
111:	learn: 0.3417227	total: 602ms	remaining: 4.77s
112:	learn: 0.3415648	total: 607ms	remaining: 4.76s
113:	learn: 0.3414713	total: 612ms	remaining: 4.75s
114:	learn: 0.3413555	total: 617ms	remaining: 4.75s
115:	learn: 0.3410340	total: 622ms	remaining: 4.74s
116:	learn: 0.3408432	total: 627ms	remaining: 4.73s
117:	learn: 0.3407284	total: 632ms	remaining: 4.72s
118:	learn: 0.3406084	total: 636ms	remaining: 4.71s
119:	learn: 0.3404721	total: 641ms	remaining: 4.7s
120:	learn: 0.3403506	total: 646ms	remaining: 4.69s
121:	learn: 0.3401737	total: 650ms	remaining: 4.68s
122:	learn: 0.3398748	total: 655ms	remaining: 4.67s
123:	learn: 0.3398222	total: 660ms	remaining: 4.66s
124:	learn: 0.3395874	total: 664ms	remaining: 4.65s
125:	learn: 0.3394411	total: 670ms	remaining: 4.65s
126:	learn: 0.3393218	total: 675ms	remaining: 4.64s
127:	learn: 0.3391488	total: 680ms	remaining: 4.63s
128:	learn: 0.3389730	total: 686ms	remaining: 4.63s
129:	learn: 0.3388832	total: 691ms	remaining: 4.62s
130:	learn: 0.3387186	total: 696ms	remaining: 4.62s
131:	learn: 0.3385835	total: 701ms	remaining: 4.61s
132:	learn: 0.3384921	total: 706ms	remaining: 4.6s
133:	learn: 0.3383895	total: 711ms	remaining: 4.59s
134:	learn: 0.3382870	total: 716ms	remaining: 4.59s
135:	learn: 0.3381739	total: 722ms	remaining: 4.58s
136:	learn: 0.3379299	total: 727ms	remaining: 4.58s
137:	learn: 0.3376915	total: 732ms	remaining: 4.57s
138:	learn: 0.3374406	total: 737ms	remaining: 4.56s
139:	learn: 0.3371457	total: 742ms	remaining: 4.55s
140:	learn: 0.3370319	total: 747ms	remaining: 4.55s
141:	learn: 0.3369207	total: 752ms	remaining: 4.55s
142:	learn: 0.3368691	total: 757ms	remaining: 4.54s
143:	learn: 0.3367036	total: 762ms	remaining: 4.53s
144:	learn: 0.3365939	total: 768ms	remaining: 4.53s
145:	learn: 0.3364444	total: 773ms	remaining: 4.52s
146:	learn: 0.3363395	total: 778ms	remaining: 4.51s
147:	learn: 0.3362135	total: 783ms	remaining: 4.51s
148:	learn: 0.3360377	total: 788ms	remaining: 4.5s
149:	learn: 0.3358827	total: 794ms	remaining: 4.5s
150:	learn: 0.3357530	total: 799ms	remaining: 4.49s
151:	learn: 0.3356160	total: 804ms	remaining: 4.48s
152:	learn: 0.3355012	total: 809ms	remaining: 4.48s
153:	learn: 0.3353675	total: 813ms	remaining: 4.47s
154:	learn: 0.3352524	total: 818ms	remaining: 4.46s
155:	learn: 0.3351362	total: 823ms	remaining: 4.45s
156:	learn: 0.3350352	total: 829ms	remaining: 4.45s
157:	learn: 0.3349147	total: 834ms	remaining: 4.44s
158:	learn: 0.3347802	total: 838ms	remaining: 4.43s
159:	learn: 0.3346367	total: 843ms	remaining: 4.43s
160:	learn: 0.3344997	total: 849ms	remaining: 4.42s
161:	learn: 0.3343277	total: 854ms	remaining: 4.42s
162:	learn: 0.3342126	total: 859ms	remaining: 4.41s
163:	learn: 0.3340589	total: 864ms	remaining: 4.4s
164:	learn: 0.3339071	total: 869ms	remaining: 4.4s
165:	learn: 0.3337936	total: 874ms	remaining: 4.39s
166:	learn: 0.3337017	total: 879ms	remaining: 4.38s
167:	learn: 0.3335236	total: 884ms	remaining: 4.38s
168:	learn: 0.3333641	total: 889ms	remaining: 4.37s
169:	learn: 0.3332553	total: 894ms	remaining: 4.36s
170:	learn: 0.3331506	total: 899ms	remaining: 4.36s
171:	learn: 0.3330365	total: 904ms	remaining: 4.35s
172:	learn: 0.3328458	total: 909ms	remaining: 4.35s
173:	learn: 0.3327296	total: 915ms	remaining: 4.34s
174:	learn: 0.3326041	total: 920ms	remaining: 4.34s
175:	learn: 0.3325227	total: 925ms	remaining: 4.33s
176:	learn: 0.3323798	total: 929ms	remaining: 4.32s
177:	learn: 0.3322585	total: 934ms	remaining: 4.31s
178:	learn: 0.3321584	total: 940ms	remaining: 4.31s
179:	learn: 0.3320557	total: 945ms	remaining: 4.3s
180:	learn: 0.3319075	total: 949ms	remaining: 4.29s
181:	learn: 0.3318176	total: 955ms	remaining: 4.29s
182:	learn: 0.3316372	total: 959ms	remaining: 4.28s
183:	learn: 0.3315283	total: 964ms	remaining: 4.28s
184:	learn: 0.3313740	total: 969ms	remaining: 4.27s
185:	learn: 0.3311973	total: 974ms	remaining: 4.26s
186:	learn: 0.3310414	total: 979ms	remaining: 4.26s
187:	learn: 0.3309351	total: 984ms	remaining: 4.25s
188:	learn: 0.3308198	total: 989ms	remaining: 4.24s
189:	learn: 0.3307328	total: 994ms	remaining: 4.24s
190:	learn: 0.3304975	total: 999ms	remaining: 4.23s
191:	learn: 0.3303708	total: 1s	remaining: 4.22s
192:	learn: 0.3302315	total: 1.01s	remaining: 4.22s
193:	learn: 0.3301040	total: 1.01s	remaining: 4.21s
194:	learn: 0.3299953	total: 1.02s	remaining: 4.21s
195:	learn: 0.3298876	total: 1.02s	remaining: 4.2s
196:	learn: 0.3297674	total: 1.03s	remaining: 4.2s
197:	learn: 0.3296668	total: 1.03s	remaining: 4.19s
198:	learn: 0.3295296	total: 1.04s	remaining: 4.18s
199:	learn: 0.3294036	total: 1.04s	remaining: 4.18s
200:	learn: 0.3292890	total: 1.05s	remaining: 4.17s
201:	learn: 0.3291492	total: 1.05s	remaining: 4.16s
202:	learn: 0.3290260	total: 1.06s	remaining: 4.16s
203:	learn: 0.3289261	total: 1.06s	remaining: 4.15s
204:	learn: 0.3287506	total: 1.07s	remaining: 4.14s
205:	learn: 0.3286042	total: 1.07s	remaining: 4.14s
206:	learn: 0.3285030	total: 1.08s	remaining: 4.13s
207:	learn: 0.3283824	total: 1.08s	remaining: 4.13s
208:	learn: 0.3283074	total: 1.09s	remaining: 4.12s
209:	learn: 0.3282169	total: 1.09s	remaining: 4.11s
210:	learn: 0.3281096	total: 1.1s	remaining: 4.11s
211:	learn: 0.3279382	total: 1.1s	remaining: 4.1s
212:	learn: 0.3277859	total: 1.11s	remaining: 4.1s
213:	learn: 0.3276537	total: 1.11s	remaining: 4.09s
214:	learn: 0.3274102	total: 1.12s	remaining: 4.08s
215:	learn: 0.3272574	total: 1.12s	remaining: 4.08s
216:	learn: 0.3271671	total: 1.13s	remaining: 4.07s
217:	learn: 0.3270706	total: 1.13s	remaining: 4.07s
218:	learn: 0.3269672	total: 1.14s	remaining: 4.06s
219:	learn: 0.3268339	total: 1.14s	remaining: 4.06s
220:	learn: 0.3267156	total: 1.15s	remaining: 4.05s
221:	learn: 0.3265533	total: 1.16s	remaining: 4.05s
222:	learn: 0.3264054	total: 1.16s	remaining: 4.04s
223:	learn: 0.3262648	total: 1.17s	remaining: 4.04s
224:	learn: 0.3261809	total: 1.17s	remaining: 4.03s
225:	learn: 0.3260720	total: 1.18s	remaining: 4.03s
226:	learn: 0.3259198	total: 1.18s	remaining: 4.02s
227:	learn: 0.3258209	total: 1.19s	remaining: 4.02s
228:	learn: 0.3256379	total: 1.19s	remaining: 4.01s
229:	learn: 0.3255138	total: 1.2s	remaining: 4s
230:	learn: 0.3253783	total: 1.2s	remaining: 4s
231:	learn: 0.3252465	total: 1.21s	remaining: 3.99s
232:	learn: 0.3250389	total: 1.21s	remaining: 3.99s
233:	learn: 0.3249089	total: 1.22s	remaining: 3.98s
234:	learn: 0.3247541	total: 1.22s	remaining: 3.98s
235:	learn: 0.3246211	total: 1.23s	remaining: 3.97s
236:	learn: 0.3244800	total: 1.23s	remaining: 3.96s
237:	learn: 0.3243438	total: 1.24s	remaining: 3.96s
238:	learn: 0.3242228	total: 1.24s	remaining: 3.95s
239:	learn: 0.3240254	total: 1.25s	remaining: 3.95s
240:	learn: 0.3239220	total: 1.25s	remaining: 3.94s
241:	learn: 0.3237814	total: 1.26s	remaining: 3.94s
242:	learn: 0.3236698	total: 1.26s	remaining: 3.93s
243:	learn: 0.3235482	total: 1.27s	remaining: 3.93s
244:	learn: 0.3234100	total: 1.27s	remaining: 3.92s
245:	learn: 0.3232642	total: 1.28s	remaining: 3.92s
246:	learn: 0.3231631	total: 1.28s	remaining: 3.91s
247:	learn: 0.3230359	total: 1.29s	remaining: 3.9s
248:	learn: 0.3228518	total: 1.29s	remaining: 3.9s
249:	learn: 0.3226914	total: 1.3s	remaining: 3.89s
250:	learn: 0.3226061	total: 1.3s	remaining: 3.88s
251:	learn: 0.3224859	total: 1.31s	remaining: 3.88s
252:	learn: 0.3223449	total: 1.31s	remaining: 3.87s
253:	learn: 0.3222486	total: 1.32s	remaining: 3.87s
254:	learn: 0.3221150	total: 1.32s	remaining: 3.86s
255:	learn: 0.3219597	total: 1.33s	remaining: 3.86s
256:	learn: 0.3218686	total: 1.33s	remaining: 3.85s
257:	learn: 0.3217019	total: 1.34s	remaining: 3.85s
258:	learn: 0.3216055	total: 1.34s	remaining: 3.84s
259:	learn: 0.3215116	total: 1.35s	remaining: 3.83s
260:	learn: 0.3213711	total: 1.35s	remaining: 3.83s
261:	learn: 0.3212400	total: 1.36s	remaining: 3.83s
262:	learn: 0.3211516	total: 1.36s	remaining: 3.82s
263:	learn: 0.3210227	total: 1.37s	remaining: 3.81s
264:	learn: 0.3209417	total: 1.37s	remaining: 3.81s
265:	learn: 0.3208124	total: 1.38s	remaining: 3.8s
266:	learn: 0.3207106	total: 1.38s	remaining: 3.8s
267:	learn: 0.3205906	total: 1.39s	remaining: 3.79s
268:	learn: 0.3204573	total: 1.39s	remaining: 3.79s
269:	learn: 0.3203376	total: 1.4s	remaining: 3.78s
270:	learn: 0.3202389	total: 1.4s	remaining: 3.78s
271:	learn: 0.3201144	total: 1.41s	remaining: 3.77s
272:	learn: 0.3200151	total: 1.41s	remaining: 3.77s
273:	learn: 0.3199090	total: 1.42s	remaining: 3.76s
274:	learn: 0.3198419	total: 1.42s	remaining: 3.76s
275:	learn: 0.3197188	total: 1.43s	remaining: 3.75s
276:	learn: 0.3196247	total: 1.43s	remaining: 3.74s
277:	learn: 0.3195215	total: 1.44s	remaining: 3.74s
278:	learn: 0.3194090	total: 1.45s	remaining: 3.74s
279:	learn: 0.3193264	total: 1.45s	remaining: 3.73s
280:	learn: 0.3192059	total: 1.46s	remaining: 3.73s
281:	learn: 0.3190860	total: 1.46s	remaining: 3.72s
282:	learn: 0.3189803	total: 1.47s	remaining: 3.72s
283:	learn: 0.3188960	total: 1.47s	remaining: 3.71s
284:	learn: 0.3188212	total: 1.48s	remaining: 3.71s
285:	learn: 0.3186948	total: 1.48s	remaining: 3.7s
286:	learn: 0.3186171	total: 1.49s	remaining: 3.69s
287:	learn: 0.3184859	total: 1.49s	remaining: 3.69s
288:	learn: 0.3184417	total: 1.5s	remaining: 3.68s
289:	learn: 0.3183227	total: 1.5s	remaining: 3.68s
290:	learn: 0.3181748	total: 1.51s	remaining: 3.67s
291:	learn: 0.3180756	total: 1.51s	remaining: 3.66s
292:	learn: 0.3179651	total: 1.52s	remaining: 3.66s
293:	learn: 0.3178838	total: 1.52s	remaining: 3.66s
294:	learn: 0.3177787	total: 1.53s	remaining: 3.65s
295:	learn: 0.3176325	total: 1.53s	remaining: 3.65s
296:	learn: 0.3174998	total: 1.54s	remaining: 3.64s
297:	learn: 0.3173808	total: 1.54s	remaining: 3.64s
298:	learn: 0.3172947	total: 1.55s	remaining: 3.63s
299:	learn: 0.3171888	total: 1.55s	remaining: 3.63s
300:	learn: 0.3171236	total: 1.56s	remaining: 3.62s
301:	learn: 0.3170329	total: 1.56s	remaining: 3.62s
302:	learn: 0.3168957	total: 1.57s	remaining: 3.61s
303:	learn: 0.3168170	total: 1.57s	remaining: 3.6s
304:	learn: 0.3167141	total: 1.58s	remaining: 3.6s
305:	learn: 0.3165716	total: 1.58s	remaining: 3.6s
306:	learn: 0.3165039	total: 1.59s	remaining: 3.59s
307:	learn: 0.3164063	total: 1.59s	remaining: 3.58s
308:	learn: 0.3163228	total: 1.6s	remaining: 3.58s
309:	learn: 0.3161984	total: 1.6s	remaining: 3.57s
310:	learn: 0.3160979	total: 1.61s	remaining: 3.57s
311:	learn: 0.3160160	total: 1.61s	remaining: 3.56s
312:	learn: 0.3159071	total: 1.62s	remaining: 3.56s
313:	learn: 0.3157867	total: 1.63s	remaining: 3.55s
314:	learn: 0.3156821	total: 1.63s	remaining: 3.55s
315:	learn: 0.3155803	total: 1.64s	remaining: 3.54s
316:	learn: 0.3154647	total: 1.64s	remaining: 3.54s
317:	learn: 0.3153641	total: 1.65s	remaining: 3.53s
318:	learn: 0.3152558	total: 1.65s	remaining: 3.52s
319:	learn: 0.3151424	total: 1.66s	remaining: 3.52s
320:	learn: 0.3150032	total: 1.66s	remaining: 3.51s
321:	learn: 0.3149125	total: 1.67s	remaining: 3.51s
322:	learn: 0.3148085	total: 1.67s	remaining: 3.5s
323:	learn: 0.3146939	total: 1.68s	remaining: 3.5s
324:	learn: 0.3146066	total: 1.68s	remaining: 3.49s
325:	learn: 0.3145225	total: 1.69s	remaining: 3.49s
326:	learn: 0.3144214	total: 1.69s	remaining: 3.48s
327:	learn: 0.3143436	total: 1.7s	remaining: 3.48s
328:	learn: 0.3142361	total: 1.7s	remaining: 3.47s
329:	learn: 0.3141817	total: 1.71s	remaining: 3.47s
330:	learn: 0.3141068	total: 1.71s	remaining: 3.46s
331:	learn: 0.3140432	total: 1.72s	remaining: 3.46s
332:	learn: 0.3139875	total: 1.72s	remaining: 3.45s
333:	learn: 0.3138653	total: 1.73s	remaining: 3.44s
334:	learn: 0.3137583	total: 1.73s	remaining: 3.44s
335:	learn: 0.3136963	total: 1.74s	remaining: 3.43s
336:	learn: 0.3135727	total: 1.74s	remaining: 3.43s
337:	learn: 0.3134735	total: 1.75s	remaining: 3.42s
338:	learn: 0.3134008	total: 1.75s	remaining: 3.41s
339:	learn: 0.3132773	total: 1.76s	remaining: 3.41s
340:	learn: 0.3132247	total: 1.76s	remaining: 3.4s
341:	learn: 0.3131518	total: 1.76s	remaining: 3.4s
342:	learn: 0.3130593	total: 1.77s	remaining: 3.39s
343:	learn: 0.3129788	total: 1.77s	remaining: 3.38s
344:	learn: 0.3128837	total: 1.78s	remaining: 3.38s
345:	learn: 0.3127845	total: 1.78s	remaining: 3.37s
346:	learn: 0.3126971	total: 1.79s	remaining: 3.37s
347:	learn: 0.3126069	total: 1.79s	remaining: 3.36s
348:	learn: 0.3125111	total: 1.8s	remaining: 3.35s
349:	learn: 0.3124533	total: 1.8s	remaining: 3.35s
350:	learn: 0.3123424	total: 1.81s	remaining: 3.34s
351:	learn: 0.3122581	total: 1.81s	remaining: 3.34s
352:	learn: 0.3121748	total: 1.82s	remaining: 3.33s
353:	learn: 0.3120701	total: 1.82s	remaining: 3.33s
354:	learn: 0.3119291	total: 1.83s	remaining: 3.32s
355:	learn: 0.3118347	total: 1.83s	remaining: 3.32s
356:	learn: 0.3117249	total: 1.84s	remaining: 3.31s
357:	learn: 0.3116427	total: 1.84s	remaining: 3.31s
358:	learn: 0.3115232	total: 1.85s	remaining: 3.3s
359:	learn: 0.3114298	total: 1.85s	remaining: 3.29s
360:	learn: 0.3113363	total: 1.86s	remaining: 3.29s
361:	learn: 0.3112324	total: 1.86s	remaining: 3.28s
362:	learn: 0.3111459	total: 1.87s	remaining: 3.28s
363:	learn: 0.3110327	total: 1.87s	remaining: 3.27s
364:	learn: 0.3109479	total: 1.88s	remaining: 3.27s
365:	learn: 0.3108411	total: 1.88s	remaining: 3.26s
366:	learn: 0.3107808	total: 1.89s	remaining: 3.26s
367:	learn: 0.3106720	total: 1.89s	remaining: 3.25s
368:	learn: 0.3105878	total: 1.9s	remaining: 3.25s
369:	learn: 0.3104872	total: 1.9s	remaining: 3.24s
370:	learn: 0.3103806	total: 1.91s	remaining: 3.23s
371:	learn: 0.3102858	total: 1.91s	remaining: 3.23s
372:	learn: 0.3101976	total: 1.92s	remaining: 3.22s
373:	learn: 0.3101362	total: 1.92s	remaining: 3.22s
374:	learn: 0.3100314	total: 1.93s	remaining: 3.21s
375:	learn: 0.3099406	total: 1.93s	remaining: 3.21s
376:	learn: 0.3098504	total: 1.94s	remaining: 3.2s
377:	learn: 0.3097638	total: 1.94s	remaining: 3.19s
378:	learn: 0.3097072	total: 1.95s	remaining: 3.19s
379:	learn: 0.3096140	total: 1.95s	remaining: 3.19s
380:	learn: 0.3095483	total: 1.96s	remaining: 3.18s
381:	learn: 0.3094139	total: 1.96s	remaining: 3.17s
382:	learn: 0.3093346	total: 1.97s	remaining: 3.17s
383:	learn: 0.3092429	total: 1.97s	remaining: 3.16s
384:	learn: 0.3091537	total: 1.98s	remaining: 3.16s
385:	learn: 0.3090671	total: 1.98s	remaining: 3.15s
386:	learn: 0.3089127	total: 1.99s	remaining: 3.15s
387:	learn: 0.3088323	total: 1.99s	remaining: 3.15s
388:	learn: 0.3087433	total: 2s	remaining: 3.14s
389:	learn: 0.3086485	total: 2s	remaining: 3.13s
390:	learn: 0.3085390	total: 2.01s	remaining: 3.13s
391:	learn: 0.3084520	total: 2.01s	remaining: 3.12s
392:	learn: 0.3083682	total: 2.02s	remaining: 3.12s
393:	learn: 0.3082596	total: 2.02s	remaining: 3.11s
394:	learn: 0.3081794	total: 2.03s	remaining: 3.11s
395:	learn: 0.3081051	total: 2.03s	remaining: 3.1s
396:	learn: 0.3080318	total: 2.04s	remaining: 3.1s
397:	learn: 0.3079391	total: 2.04s	remaining: 3.09s
398:	learn: 0.3078626	total: 2.05s	remaining: 3.08s
399:	learn: 0.3077771	total: 2.05s	remaining: 3.08s
400:	learn: 0.3077022	total: 2.06s	remaining: 3.08s
401:	learn: 0.3076343	total: 2.06s	remaining: 3.07s
402:	learn: 0.3075612	total: 2.07s	remaining: 3.06s
403:	learn: 0.3074962	total: 2.07s	remaining: 3.06s
404:	learn: 0.3074106	total: 2.08s	remaining: 3.05s
405:	learn: 0.3073206	total: 2.08s	remaining: 3.05s
406:	learn: 0.3072696	total: 2.09s	remaining: 3.04s
407:	learn: 0.3071831	total: 2.09s	remaining: 3.04s
408:	learn: 0.3071537	total: 2.1s	remaining: 3.03s
409:	learn: 0.3070547	total: 2.1s	remaining: 3.03s
410:	learn: 0.3069817	total: 2.11s	remaining: 3.02s
411:	learn: 0.3069261	total: 2.11s	remaining: 3.02s
412:	learn: 0.3068158	total: 2.12s	remaining: 3.01s
413:	learn: 0.3066867	total: 2.12s	remaining: 3s
414:	learn: 0.3065953	total: 2.13s	remaining: 3s
415:	learn: 0.3064975	total: 2.13s	remaining: 2.99s
416:	learn: 0.3064213	total: 2.14s	remaining: 2.99s
417:	learn: 0.3063339	total: 2.14s	remaining: 2.98s
418:	learn: 0.3062265	total: 2.15s	remaining: 2.98s
419:	learn: 0.3061267	total: 2.15s	remaining: 2.97s
420:	learn: 0.3060425	total: 2.16s	remaining: 2.97s
421:	learn: 0.3059447	total: 2.16s	remaining: 2.96s
422:	learn: 0.3058366	total: 2.17s	remaining: 2.96s
423:	learn: 0.3057861	total: 2.17s	remaining: 2.95s
424:	learn: 0.3056758	total: 2.18s	remaining: 2.95s
425:	learn: 0.3055807	total: 2.18s	remaining: 2.94s
426:	learn: 0.3054569	total: 2.19s	remaining: 2.94s
427:	learn: 0.3053621	total: 2.19s	remaining: 2.93s
428:	learn: 0.3053097	total: 2.2s	remaining: 2.93s
429:	learn: 0.3052549	total: 2.2s	remaining: 2.92s
430:	learn: 0.3051584	total: 2.21s	remaining: 2.92s
431:	learn: 0.3050975	total: 2.21s	remaining: 2.91s
432:	learn: 0.3050296	total: 2.22s	remaining: 2.9s
433:	learn: 0.3049412	total: 2.22s	remaining: 2.9s
434:	learn: 0.3048595	total: 2.23s	remaining: 2.89s
435:	learn: 0.3047963	total: 2.23s	remaining: 2.89s
436:	learn: 0.3047189	total: 2.24s	remaining: 2.88s
437:	learn: 0.3046516	total: 2.24s	remaining: 2.88s
438:	learn: 0.3045698	total: 2.25s	remaining: 2.87s
439:	learn: 0.3044942	total: 2.25s	remaining: 2.87s
440:	learn: 0.3043972	total: 2.26s	remaining: 2.86s
441:	learn: 0.3043251	total: 2.26s	remaining: 2.85s
442:	learn: 0.3042285	total: 2.27s	remaining: 2.85s
443:	learn: 0.3041437	total: 2.27s	remaining: 2.85s
444:	learn: 0.3040687	total: 2.28s	remaining: 2.84s
445:	learn: 0.3039562	total: 2.28s	remaining: 2.83s
446:	learn: 0.3038838	total: 2.29s	remaining: 2.83s
447:	learn: 0.3038048	total: 2.29s	remaining: 2.82s
448:	learn: 0.3037546	total: 2.3s	remaining: 2.82s
449:	learn: 0.3036860	total: 2.3s	remaining: 2.81s
450:	learn: 0.3035869	total: 2.31s	remaining: 2.81s
451:	learn: 0.3034740	total: 2.31s	remaining: 2.8s
452:	learn: 0.3034298	total: 2.31s	remaining: 2.8s
453:	learn: 0.3033510	total: 2.32s	remaining: 2.79s
454:	learn: 0.3032689	total: 2.33s	remaining: 2.79s
455:	learn: 0.3031847	total: 2.33s	remaining: 2.78s
456:	learn: 0.3030980	total: 2.34s	remaining: 2.78s
457:	learn: 0.3030409	total: 2.34s	remaining: 2.77s
458:	learn: 0.3029614	total: 2.35s	remaining: 2.77s
459:	learn: 0.3029039	total: 2.35s	remaining: 2.76s
460:	learn: 0.3028439	total: 2.36s	remaining: 2.76s
461:	learn: 0.3027680	total: 2.36s	remaining: 2.75s
462:	learn: 0.3026819	total: 2.37s	remaining: 2.75s
463:	learn: 0.3026097	total: 2.37s	remaining: 2.74s
464:	learn: 0.3025483	total: 2.38s	remaining: 2.73s
465:	learn: 0.3024799	total: 2.38s	remaining: 2.73s
466:	learn: 0.3024249	total: 2.39s	remaining: 2.72s
467:	learn: 0.3023528	total: 2.39s	remaining: 2.72s
468:	learn: 0.3022722	total: 2.4s	remaining: 2.71s
469:	learn: 0.3021773	total: 2.4s	remaining: 2.71s
470:	learn: 0.3020985	total: 2.41s	remaining: 2.7s
471:	learn: 0.3020116	total: 2.41s	remaining: 2.7s
472:	learn: 0.3019337	total: 2.42s	remaining: 2.69s
473:	learn: 0.3018491	total: 2.42s	remaining: 2.69s
474:	learn: 0.3017621	total: 2.43s	remaining: 2.68s
475:	learn: 0.3016931	total: 2.43s	remaining: 2.68s
476:	learn: 0.3016008	total: 2.44s	remaining: 2.67s
477:	learn: 0.3015464	total: 2.44s	remaining: 2.67s
478:	learn: 0.3014532	total: 2.45s	remaining: 2.66s
479:	learn: 0.3013431	total: 2.45s	remaining: 2.66s
480:	learn: 0.3012564	total: 2.46s	remaining: 2.65s
481:	learn: 0.3011806	total: 2.46s	remaining: 2.65s
482:	learn: 0.3011114	total: 2.47s	remaining: 2.64s
483:	learn: 0.3010378	total: 2.47s	remaining: 2.63s
484:	learn: 0.3009851	total: 2.48s	remaining: 2.63s
485:	learn: 0.3009153	total: 2.48s	remaining: 2.63s
486:	learn: 0.3008365	total: 2.49s	remaining: 2.62s
487:	learn: 0.3007260	total: 2.49s	remaining: 2.62s
488:	learn: 0.3006652	total: 2.5s	remaining: 2.61s
489:	learn: 0.3005915	total: 2.5s	remaining: 2.6s
490:	learn: 0.3005142	total: 2.51s	remaining: 2.6s
491:	learn: 0.3004292	total: 2.51s	remaining: 2.59s
492:	learn: 0.3003584	total: 2.52s	remaining: 2.59s
493:	learn: 0.3002779	total: 2.52s	remaining: 2.58s
494:	learn: 0.3002318	total: 2.53s	remaining: 2.58s
495:	learn: 0.3001370	total: 2.53s	remaining: 2.57s
496:	learn: 0.3000432	total: 2.54s	remaining: 2.57s
497:	learn: 0.2999505	total: 2.54s	remaining: 2.56s
498:	learn: 0.2998643	total: 2.55s	remaining: 2.56s
499:	learn: 0.2998090	total: 2.55s	remaining: 2.55s
500:	learn: 0.2997534	total: 2.56s	remaining: 2.55s
501:	learn: 0.2996748	total: 2.56s	remaining: 2.54s
502:	learn: 0.2995726	total: 2.57s	remaining: 2.54s
503:	learn: 0.2995251	total: 2.57s	remaining: 2.53s
504:	learn: 0.2994724	total: 2.58s	remaining: 2.53s
505:	learn: 0.2993666	total: 2.58s	remaining: 2.52s
506:	learn: 0.2992951	total: 2.59s	remaining: 2.52s
507:	learn: 0.2991901	total: 2.59s	remaining: 2.51s
508:	learn: 0.2991333	total: 2.6s	remaining: 2.51s
509:	learn: 0.2990388	total: 2.6s	remaining: 2.5s
510:	learn: 0.2989629	total: 2.61s	remaining: 2.5s
511:	learn: 0.2988923	total: 2.61s	remaining: 2.49s
512:	learn: 0.2988227	total: 2.62s	remaining: 2.48s
513:	learn: 0.2987617	total: 2.62s	remaining: 2.48s
514:	learn: 0.2986877	total: 2.63s	remaining: 2.47s
515:	learn: 0.2986143	total: 2.63s	remaining: 2.47s
516:	learn: 0.2985232	total: 2.64s	remaining: 2.46s
517:	learn: 0.2984662	total: 2.64s	remaining: 2.46s
518:	learn: 0.2983965	total: 2.65s	remaining: 2.45s
519:	learn: 0.2982992	total: 2.65s	remaining: 2.45s
520:	learn: 0.2982302	total: 2.66s	remaining: 2.44s
521:	learn: 0.2981332	total: 2.66s	remaining: 2.44s
522:	learn: 0.2980886	total: 2.67s	remaining: 2.43s
523:	learn: 0.2980268	total: 2.67s	remaining: 2.43s
524:	learn: 0.2979572	total: 2.67s	remaining: 2.42s
525:	learn: 0.2978849	total: 2.68s	remaining: 2.42s
526:	learn: 0.2978241	total: 2.69s	remaining: 2.41s
527:	learn: 0.2977409	total: 2.69s	remaining: 2.41s
528:	learn: 0.2976609	total: 2.7s	remaining: 2.4s
529:	learn: 0.2976089	total: 2.7s	remaining: 2.4s
530:	learn: 0.2975532	total: 2.71s	remaining: 2.39s
531:	learn: 0.2974692	total: 2.71s	remaining: 2.39s
532:	learn: 0.2973643	total: 2.72s	remaining: 2.38s
533:	learn: 0.2973002	total: 2.73s	remaining: 2.38s
534:	learn: 0.2972295	total: 2.73s	remaining: 2.37s
535:	learn: 0.2971898	total: 2.73s	remaining: 2.37s
536:	learn: 0.2971111	total: 2.74s	remaining: 2.36s
537:	learn: 0.2969984	total: 2.75s	remaining: 2.36s
538:	learn: 0.2969440	total: 2.75s	remaining: 2.35s
539:	learn: 0.2968930	total: 2.76s	remaining: 2.35s
540:	learn: 0.2968285	total: 2.76s	remaining: 2.34s
541:	learn: 0.2967461	total: 2.77s	remaining: 2.34s
542:	learn: 0.2966975	total: 2.77s	remaining: 2.33s
543:	learn: 0.2966162	total: 2.77s	remaining: 2.33s
544:	learn: 0.2965467	total: 2.78s	remaining: 2.32s
545:	learn: 0.2964675	total: 2.78s	remaining: 2.31s
546:	learn: 0.2963559	total: 2.79s	remaining: 2.31s
547:	learn: 0.2962569	total: 2.79s	remaining: 2.3s
548:	learn: 0.2961626	total: 2.8s	remaining: 2.3s
549:	learn: 0.2960827	total: 2.8s	remaining: 2.29s
550:	learn: 0.2960259	total: 2.81s	remaining: 2.29s
551:	learn: 0.2959663	total: 2.81s	remaining: 2.28s
552:	learn: 0.2958858	total: 2.82s	remaining: 2.28s
553:	learn: 0.2958063	total: 2.82s	remaining: 2.27s
554:	learn: 0.2957347	total: 2.83s	remaining: 2.27s
555:	learn: 0.2956566	total: 2.83s	remaining: 2.26s
556:	learn: 0.2955976	total: 2.84s	remaining: 2.26s
557:	learn: 0.2955473	total: 2.84s	remaining: 2.25s
558:	learn: 0.2954321	total: 2.85s	remaining: 2.25s
559:	learn: 0.2953565	total: 2.85s	remaining: 2.24s
560:	learn: 0.2952556	total: 2.85s	remaining: 2.23s
561:	learn: 0.2951933	total: 2.86s	remaining: 2.23s
562:	learn: 0.2951272	total: 2.87s	remaining: 2.22s
563:	learn: 0.2950508	total: 2.87s	remaining: 2.22s
564:	learn: 0.2949687	total: 2.88s	remaining: 2.21s
565:	learn: 0.2949056	total: 2.88s	remaining: 2.21s
566:	learn: 0.2948426	total: 2.88s	remaining: 2.2s
567:	learn: 0.2947476	total: 2.89s	remaining: 2.2s
568:	learn: 0.2946742	total: 2.89s	remaining: 2.19s
569:	learn: 0.2946315	total: 2.9s	remaining: 2.19s
570:	learn: 0.2945449	total: 2.9s	remaining: 2.18s
571:	learn: 0.2944516	total: 2.91s	remaining: 2.18s
572:	learn: 0.2944170	total: 2.91s	remaining: 2.17s
573:	learn: 0.2943533	total: 2.92s	remaining: 2.17s
574:	learn: 0.2942994	total: 2.92s	remaining: 2.16s
575:	learn: 0.2942159	total: 2.93s	remaining: 2.15s
576:	learn: 0.2941381	total: 2.93s	remaining: 2.15s
577:	learn: 0.2940818	total: 2.94s	remaining: 2.14s
578:	learn: 0.2939972	total: 2.94s	remaining: 2.14s
579:	learn: 0.2939255	total: 2.95s	remaining: 2.13s
580:	learn: 0.2938601	total: 2.95s	remaining: 2.13s
581:	learn: 0.2937912	total: 2.96s	remaining: 2.12s
582:	learn: 0.2936830	total: 2.96s	remaining: 2.12s
583:	learn: 0.2936143	total: 2.97s	remaining: 2.11s
584:	learn: 0.2935501	total: 2.97s	remaining: 2.11s
585:	learn: 0.2934599	total: 2.98s	remaining: 2.1s
586:	learn: 0.2933877	total: 2.98s	remaining: 2.1s
587:	learn: 0.2933551	total: 2.98s	remaining: 2.09s
588:	learn: 0.2933381	total: 2.99s	remaining: 2.09s
589:	learn: 0.2932616	total: 2.99s	remaining: 2.08s
590:	learn: 0.2932224	total: 3s	remaining: 2.08s
591:	learn: 0.2931451	total: 3s	remaining: 2.07s
592:	learn: 0.2930582	total: 3.01s	remaining: 2.06s
593:	learn: 0.2930233	total: 3.01s	remaining: 2.06s
594:	learn: 0.2929312	total: 3.02s	remaining: 2.05s
595:	learn: 0.2928527	total: 3.02s	remaining: 2.05s
596:	learn: 0.2927651	total: 3.03s	remaining: 2.04s
597:	learn: 0.2927462	total: 3.03s	remaining: 2.04s
598:	learn: 0.2927075	total: 3.04s	remaining: 2.03s
599:	learn: 0.2926137	total: 3.04s	remaining: 2.03s
600:	learn: 0.2925318	total: 3.05s	remaining: 2.02s
601:	learn: 0.2924497	total: 3.05s	remaining: 2.02s
602:	learn: 0.2923421	total: 3.06s	remaining: 2.01s
603:	learn: 0.2922557	total: 3.06s	remaining: 2.01s
604:	learn: 0.2922010	total: 3.07s	remaining: 2s
605:	learn: 0.2921345	total: 3.08s	remaining: 2s
606:	learn: 0.2920926	total: 3.08s	remaining: 1.99s
607:	learn: 0.2920226	total: 3.08s	remaining: 1.99s
608:	learn: 0.2919242	total: 3.09s	remaining: 1.98s
609:	learn: 0.2918441	total: 3.09s	remaining: 1.98s
610:	learn: 0.2917738	total: 3.1s	remaining: 1.97s
611:	learn: 0.2917127	total: 3.1s	remaining: 1.97s
612:	learn: 0.2916414	total: 3.11s	remaining: 1.96s
613:	learn: 0.2915740	total: 3.11s	remaining: 1.96s
614:	learn: 0.2915124	total: 3.12s	remaining: 1.95s
615:	learn: 0.2914196	total: 3.12s	remaining: 1.95s
616:	learn: 0.2913584	total: 3.13s	remaining: 1.94s
617:	learn: 0.2913020	total: 3.13s	remaining: 1.94s
618:	learn: 0.2912024	total: 3.14s	remaining: 1.93s
619:	learn: 0.2911239	total: 3.14s	remaining: 1.93s
620:	learn: 0.2910459	total: 3.15s	remaining: 1.92s
621:	learn: 0.2909469	total: 3.15s	remaining: 1.92s
622:	learn: 0.2908467	total: 3.16s	remaining: 1.91s
623:	learn: 0.2907481	total: 3.16s	remaining: 1.9s
624:	learn: 0.2906433	total: 3.17s	remaining: 1.9s
625:	learn: 0.2905577	total: 3.17s	remaining: 1.89s
626:	learn: 0.2905127	total: 3.18s	remaining: 1.89s
627:	learn: 0.2904399	total: 3.18s	remaining: 1.88s
628:	learn: 0.2903580	total: 3.19s	remaining: 1.88s
629:	learn: 0.2902788	total: 3.19s	remaining: 1.87s
630:	learn: 0.2902098	total: 3.2s	remaining: 1.87s
631:	learn: 0.2901401	total: 3.2s	remaining: 1.86s
632:	learn: 0.2900652	total: 3.21s	remaining: 1.86s
633:	learn: 0.2900242	total: 3.21s	remaining: 1.85s
634:	learn: 0.2899489	total: 3.22s	remaining: 1.85s
635:	learn: 0.2898584	total: 3.22s	remaining: 1.84s
636:	learn: 0.2897922	total: 3.23s	remaining: 1.84s
637:	learn: 0.2897301	total: 3.23s	remaining: 1.83s
638:	learn: 0.2896596	total: 3.24s	remaining: 1.83s
639:	learn: 0.2895775	total: 3.24s	remaining: 1.82s
640:	learn: 0.2895082	total: 3.25s	remaining: 1.82s
641:	learn: 0.2894346	total: 3.25s	remaining: 1.81s
642:	learn: 0.2893559	total: 3.25s	remaining: 1.81s
643:	learn: 0.2892724	total: 3.26s	remaining: 1.8s
644:	learn: 0.2892097	total: 3.26s	remaining: 1.8s
645:	learn: 0.2891654	total: 3.27s	remaining: 1.79s
646:	learn: 0.2890687	total: 3.27s	remaining: 1.79s
647:	learn: 0.2890285	total: 3.28s	remaining: 1.78s
648:	learn: 0.2889769	total: 3.28s	remaining: 1.77s
649:	learn: 0.2888957	total: 3.29s	remaining: 1.77s
650:	learn: 0.2888189	total: 3.29s	remaining: 1.76s
651:	learn: 0.2887541	total: 3.3s	remaining: 1.76s
652:	learn: 0.2886863	total: 3.3s	remaining: 1.75s
653:	learn: 0.2886034	total: 3.31s	remaining: 1.75s
654:	learn: 0.2885385	total: 3.31s	remaining: 1.75s
655:	learn: 0.2884428	total: 3.32s	remaining: 1.74s
656:	learn: 0.2883660	total: 3.32s	remaining: 1.73s
657:	learn: 0.2883192	total: 3.33s	remaining: 1.73s
658:	learn: 0.2882640	total: 3.33s	remaining: 1.72s
659:	learn: 0.2882094	total: 3.34s	remaining: 1.72s
660:	learn: 0.2881481	total: 3.34s	remaining: 1.71s
661:	learn: 0.2881210	total: 3.35s	remaining: 1.71s
662:	learn: 0.2880582	total: 3.35s	remaining: 1.7s
663:	learn: 0.2879943	total: 3.36s	remaining: 1.7s
664:	learn: 0.2879485	total: 3.36s	remaining: 1.69s
665:	learn: 0.2878525	total: 3.37s	remaining: 1.69s
666:	learn: 0.2877958	total: 3.37s	remaining: 1.68s
667:	learn: 0.2877412	total: 3.38s	remaining: 1.68s
668:	learn: 0.2876574	total: 3.38s	remaining: 1.67s
669:	learn: 0.2875641	total: 3.38s	remaining: 1.67s
670:	learn: 0.2875040	total: 3.39s	remaining: 1.66s
671:	learn: 0.2874183	total: 3.4s	remaining: 1.66s
672:	learn: 0.2873528	total: 3.4s	remaining: 1.65s
673:	learn: 0.2872610	total: 3.41s	remaining: 1.65s
674:	learn: 0.2871615	total: 3.41s	remaining: 1.64s
675:	learn: 0.2871049	total: 3.42s	remaining: 1.64s
676:	learn: 0.2870517	total: 3.42s	remaining: 1.63s
677:	learn: 0.2869800	total: 3.43s	remaining: 1.63s
678:	learn: 0.2869039	total: 3.43s	remaining: 1.62s
679:	learn: 0.2868443	total: 3.44s	remaining: 1.62s
680:	learn: 0.2867686	total: 3.44s	remaining: 1.61s
681:	learn: 0.2867116	total: 3.44s	remaining: 1.61s
682:	learn: 0.2866636	total: 3.45s	remaining: 1.6s
683:	learn: 0.2865929	total: 3.46s	remaining: 1.6s
684:	learn: 0.2865243	total: 3.46s	remaining: 1.59s
685:	learn: 0.2864726	total: 3.47s	remaining: 1.59s
686:	learn: 0.2863952	total: 3.47s	remaining: 1.58s
687:	learn: 0.2863235	total: 3.48s	remaining: 1.58s
688:	learn: 0.2862116	total: 3.48s	remaining: 1.57s
689:	learn: 0.2861360	total: 3.49s	remaining: 1.57s
690:	learn: 0.2860810	total: 3.49s	remaining: 1.56s
691:	learn: 0.2860076	total: 3.5s	remaining: 1.56s
692:	learn: 0.2859400	total: 3.5s	remaining: 1.55s
693:	learn: 0.2858933	total: 3.51s	remaining: 1.55s
694:	learn: 0.2858370	total: 3.51s	remaining: 1.54s
695:	learn: 0.2857944	total: 3.52s	remaining: 1.54s
696:	learn: 0.2857132	total: 3.52s	remaining: 1.53s
697:	learn: 0.2857038	total: 3.53s	remaining: 1.53s
698:	learn: 0.2856453	total: 3.53s	remaining: 1.52s
699:	learn: 0.2855912	total: 3.54s	remaining: 1.51s
700:	learn: 0.2854919	total: 3.54s	remaining: 1.51s
701:	learn: 0.2854530	total: 3.55s	remaining: 1.5s
702:	learn: 0.2853927	total: 3.55s	remaining: 1.5s
703:	learn: 0.2853119	total: 3.56s	remaining: 1.5s
704:	learn: 0.2852382	total: 3.56s	remaining: 1.49s
705:	learn: 0.2852033	total: 3.56s	remaining: 1.48s
706:	learn: 0.2851265	total: 3.57s	remaining: 1.48s
707:	learn: 0.2850510	total: 3.57s	remaining: 1.47s
708:	learn: 0.2850105	total: 3.58s	remaining: 1.47s
709:	learn: 0.2849512	total: 3.58s	remaining: 1.46s
710:	learn: 0.2849017	total: 3.59s	remaining: 1.46s
711:	learn: 0.2848146	total: 3.59s	remaining: 1.45s
712:	learn: 0.2847268	total: 3.6s	remaining: 1.45s
713:	learn: 0.2846432	total: 3.6s	remaining: 1.44s
714:	learn: 0.2845836	total: 3.61s	remaining: 1.44s
715:	learn: 0.2845300	total: 3.61s	remaining: 1.43s
716:	learn: 0.2844755	total: 3.62s	remaining: 1.43s
717:	learn: 0.2843984	total: 3.62s	remaining: 1.42s
718:	learn: 0.2843090	total: 3.63s	remaining: 1.42s
719:	learn: 0.2842475	total: 3.63s	remaining: 1.41s
720:	learn: 0.2842320	total: 3.64s	remaining: 1.41s
721:	learn: 0.2841863	total: 3.64s	remaining: 1.4s
722:	learn: 0.2841024	total: 3.65s	remaining: 1.4s
723:	learn: 0.2840465	total: 3.65s	remaining: 1.39s
724:	learn: 0.2839943	total: 3.66s	remaining: 1.39s
725:	learn: 0.2838898	total: 3.66s	remaining: 1.38s
726:	learn: 0.2838110	total: 3.67s	remaining: 1.38s
727:	learn: 0.2837224	total: 3.67s	remaining: 1.37s
728:	learn: 0.2836858	total: 3.68s	remaining: 1.37s
729:	learn: 0.2835848	total: 3.68s	remaining: 1.36s
730:	learn: 0.2835217	total: 3.69s	remaining: 1.36s
731:	learn: 0.2834360	total: 3.69s	remaining: 1.35s
732:	learn: 0.2833511	total: 3.69s	remaining: 1.35s
733:	learn: 0.2833040	total: 3.7s	remaining: 1.34s
734:	learn: 0.2832208	total: 3.71s	remaining: 1.33s
735:	learn: 0.2831371	total: 3.71s	remaining: 1.33s
736:	learn: 0.2830439	total: 3.71s	remaining: 1.32s
737:	learn: 0.2830155	total: 3.72s	remaining: 1.32s
738:	learn: 0.2829932	total: 3.72s	remaining: 1.31s
739:	learn: 0.2829243	total: 3.73s	remaining: 1.31s
740:	learn: 0.2828548	total: 3.73s	remaining: 1.3s
741:	learn: 0.2827935	total: 3.74s	remaining: 1.3s
742:	learn: 0.2827280	total: 3.74s	remaining: 1.29s
743:	learn: 0.2826669	total: 3.75s	remaining: 1.29s
744:	learn: 0.2825957	total: 3.75s	remaining: 1.28s
745:	learn: 0.2825359	total: 3.76s	remaining: 1.28s
746:	learn: 0.2824586	total: 3.76s	remaining: 1.27s
747:	learn: 0.2824110	total: 3.77s	remaining: 1.27s
748:	learn: 0.2823452	total: 3.77s	remaining: 1.26s
749:	learn: 0.2823002	total: 3.78s	remaining: 1.26s
750:	learn: 0.2822604	total: 3.78s	remaining: 1.25s
751:	learn: 0.2822149	total: 3.79s	remaining: 1.25s
752:	learn: 0.2821527	total: 3.79s	remaining: 1.24s
753:	learn: 0.2821142	total: 3.79s	remaining: 1.24s
754:	learn: 0.2820533	total: 3.8s	remaining: 1.23s
755:	learn: 0.2820188	total: 3.8s	remaining: 1.23s
756:	learn: 0.2819586	total: 3.81s	remaining: 1.22s
757:	learn: 0.2819367	total: 3.81s	remaining: 1.22s
758:	learn: 0.2819031	total: 3.82s	remaining: 1.21s
759:	learn: 0.2818543	total: 3.82s	remaining: 1.21s
760:	learn: 0.2817963	total: 3.83s	remaining: 1.2s
761:	learn: 0.2817151	total: 3.83s	remaining: 1.2s
762:	learn: 0.2816598	total: 3.84s	remaining: 1.19s
763:	learn: 0.2816054	total: 3.84s	remaining: 1.19s
764:	learn: 0.2815513	total: 3.85s	remaining: 1.18s
765:	learn: 0.2814786	total: 3.85s	remaining: 1.18s
766:	learn: 0.2814121	total: 3.86s	remaining: 1.17s
767:	learn: 0.2813485	total: 3.86s	remaining: 1.17s
768:	learn: 0.2813034	total: 3.87s	remaining: 1.16s
769:	learn: 0.2812400	total: 3.87s	remaining: 1.16s
770:	learn: 0.2811752	total: 3.88s	remaining: 1.15s
771:	learn: 0.2811315	total: 3.88s	remaining: 1.15s
772:	learn: 0.2810574	total: 3.88s	remaining: 1.14s
773:	learn: 0.2809901	total: 3.89s	remaining: 1.14s
774:	learn: 0.2809321	total: 3.9s	remaining: 1.13s
775:	learn: 0.2808702	total: 3.9s	remaining: 1.13s
776:	learn: 0.2808008	total: 3.9s	remaining: 1.12s
777:	learn: 0.2807478	total: 3.91s	remaining: 1.11s
778:	learn: 0.2806923	total: 3.91s	remaining: 1.11s
779:	learn: 0.2806555	total: 3.92s	remaining: 1.1s
780:	learn: 0.2806233	total: 3.92s	remaining: 1.1s
781:	learn: 0.2805457	total: 3.93s	remaining: 1.09s
782:	learn: 0.2804524	total: 3.93s	remaining: 1.09s
783:	learn: 0.2803787	total: 3.94s	remaining: 1.08s
784:	learn: 0.2803219	total: 3.94s	remaining: 1.08s
785:	learn: 0.2802694	total: 3.95s	remaining: 1.07s
786:	learn: 0.2801942	total: 3.95s	remaining: 1.07s
787:	learn: 0.2801454	total: 3.96s	remaining: 1.06s
788:	learn: 0.2800790	total: 3.96s	remaining: 1.06s
789:	learn: 0.2800200	total: 3.97s	remaining: 1.05s
790:	learn: 0.2799773	total: 3.97s	remaining: 1.05s
791:	learn: 0.2798800	total: 3.98s	remaining: 1.04s
792:	learn: 0.2798363	total: 3.98s	remaining: 1.04s
793:	learn: 0.2797771	total: 3.99s	remaining: 1.03s
794:	learn: 0.2796891	total: 3.99s	remaining: 1.03s
795:	learn: 0.2796074	total: 4s	remaining: 1.02s
796:	learn: 0.2795547	total: 4s	remaining: 1.02s
797:	learn: 0.2795301	total: 4.01s	remaining: 1.01s
798:	learn: 0.2794830	total: 4.01s	remaining: 1.01s
799:	learn: 0.2794245	total: 4.02s	remaining: 1s
800:	learn: 0.2793920	total: 4.02s	remaining: 1000ms
801:	learn: 0.2793311	total: 4.03s	remaining: 995ms
802:	learn: 0.2792535	total: 4.03s	remaining: 990ms
803:	learn: 0.2792016	total: 4.04s	remaining: 984ms
804:	learn: 0.2791145	total: 4.04s	remaining: 979ms
805:	learn: 0.2790810	total: 4.05s	remaining: 974ms
806:	learn: 0.2790362	total: 4.05s	remaining: 969ms
807:	learn: 0.2789659	total: 4.06s	remaining: 964ms
808:	learn: 0.2788849	total: 4.06s	remaining: 959ms
809:	learn: 0.2788345	total: 4.07s	remaining: 954ms
810:	learn: 0.2787676	total: 4.07s	remaining: 949ms
811:	learn: 0.2787124	total: 4.08s	remaining: 944ms
812:	learn: 0.2786522	total: 4.08s	remaining: 939ms
813:	learn: 0.2786166	total: 4.08s	remaining: 934ms
814:	learn: 0.2785403	total: 4.09s	remaining: 928ms
815:	learn: 0.2784643	total: 4.09s	remaining: 923ms
816:	learn: 0.2783996	total: 4.1s	remaining: 918ms
817:	learn: 0.2783489	total: 4.1s	remaining: 913ms
818:	learn: 0.2782636	total: 4.11s	remaining: 908ms
819:	learn: 0.2781995	total: 4.11s	remaining: 903ms
820:	learn: 0.2781649	total: 4.12s	remaining: 898ms
821:	learn: 0.2781171	total: 4.12s	remaining: 893ms
822:	learn: 0.2780640	total: 4.13s	remaining: 888ms
823:	learn: 0.2780233	total: 4.13s	remaining: 883ms
824:	learn: 0.2779794	total: 4.14s	remaining: 878ms
825:	learn: 0.2779369	total: 4.14s	remaining: 873ms
826:	learn: 0.2778760	total: 4.15s	remaining: 868ms
827:	learn: 0.2778437	total: 4.15s	remaining: 863ms
828:	learn: 0.2777736	total: 4.16s	remaining: 858ms
829:	learn: 0.2777112	total: 4.16s	remaining: 853ms
830:	learn: 0.2776350	total: 4.17s	remaining: 847ms
831:	learn: 0.2775754	total: 4.17s	remaining: 842ms
832:	learn: 0.2775170	total: 4.18s	remaining: 838ms
833:	learn: 0.2774614	total: 4.18s	remaining: 833ms
834:	learn: 0.2773855	total: 4.19s	remaining: 828ms
835:	learn: 0.2773253	total: 4.19s	remaining: 823ms
836:	learn: 0.2772417	total: 4.2s	remaining: 818ms
837:	learn: 0.2771836	total: 4.2s	remaining: 813ms
838:	learn: 0.2771140	total: 4.21s	remaining: 808ms
839:	learn: 0.2770543	total: 4.21s	remaining: 803ms
840:	learn: 0.2770010	total: 4.22s	remaining: 798ms
841:	learn: 0.2769633	total: 4.22s	remaining: 793ms
842:	learn: 0.2769073	total: 4.23s	remaining: 788ms
843:	learn: 0.2768557	total: 4.23s	remaining: 782ms
844:	learn: 0.2768106	total: 4.24s	remaining: 777ms
845:	learn: 0.2767772	total: 4.24s	remaining: 772ms
846:	learn: 0.2767138	total: 4.25s	remaining: 767ms
847:	learn: 0.2766378	total: 4.25s	remaining: 762ms
848:	learn: 0.2765508	total: 4.26s	remaining: 757ms
849:	learn: 0.2765165	total: 4.26s	remaining: 752ms
850:	learn: 0.2764529	total: 4.27s	remaining: 747ms
851:	learn: 0.2764013	total: 4.27s	remaining: 742ms
852:	learn: 0.2763387	total: 4.28s	remaining: 737ms
853:	learn: 0.2762832	total: 4.28s	remaining: 732ms
854:	learn: 0.2762526	total: 4.29s	remaining: 727ms
855:	learn: 0.2762197	total: 4.29s	remaining: 722ms
856:	learn: 0.2761707	total: 4.29s	remaining: 717ms
857:	learn: 0.2761278	total: 4.3s	remaining: 712ms
858:	learn: 0.2760558	total: 4.3s	remaining: 707ms
859:	learn: 0.2760351	total: 4.31s	remaining: 701ms
860:	learn: 0.2760010	total: 4.31s	remaining: 696ms
861:	learn: 0.2759165	total: 4.32s	remaining: 691ms
862:	learn: 0.2758426	total: 4.32s	remaining: 686ms
863:	learn: 0.2757892	total: 4.33s	remaining: 681ms
864:	learn: 0.2757221	total: 4.33s	remaining: 676ms
865:	learn: 0.2756896	total: 4.34s	remaining: 671ms
866:	learn: 0.2756437	total: 4.34s	remaining: 666ms
867:	learn: 0.2755723	total: 4.35s	remaining: 661ms
868:	learn: 0.2755102	total: 4.35s	remaining: 656ms
869:	learn: 0.2754664	total: 4.36s	remaining: 651ms
870:	learn: 0.2754167	total: 4.36s	remaining: 646ms
871:	learn: 0.2753492	total: 4.37s	remaining: 641ms
872:	learn: 0.2752987	total: 4.37s	remaining: 636ms
873:	learn: 0.2752409	total: 4.38s	remaining: 631ms
874:	learn: 0.2752157	total: 4.38s	remaining: 626ms
875:	learn: 0.2751573	total: 4.39s	remaining: 621ms
876:	learn: 0.2751281	total: 4.39s	remaining: 616ms
877:	learn: 0.2750619	total: 4.4s	remaining: 611ms
878:	learn: 0.2749958	total: 4.4s	remaining: 606ms
879:	learn: 0.2749751	total: 4.41s	remaining: 601ms
880:	learn: 0.2749189	total: 4.41s	remaining: 596ms
881:	learn: 0.2748456	total: 4.42s	remaining: 591ms
882:	learn: 0.2747660	total: 4.42s	remaining: 586ms
883:	learn: 0.2747076	total: 4.42s	remaining: 581ms
884:	learn: 0.2746716	total: 4.43s	remaining: 576ms
885:	learn: 0.2745968	total: 4.43s	remaining: 571ms
886:	learn: 0.2745524	total: 4.44s	remaining: 566ms
887:	learn: 0.2745103	total: 4.44s	remaining: 561ms
888:	learn: 0.2744648	total: 4.45s	remaining: 556ms
889:	learn: 0.2744270	total: 4.45s	remaining: 551ms
890:	learn: 0.2743754	total: 4.46s	remaining: 546ms
891:	learn: 0.2742897	total: 4.46s	remaining: 540ms
892:	learn: 0.2742480	total: 4.47s	remaining: 535ms
893:	learn: 0.2741973	total: 4.47s	remaining: 530ms
894:	learn: 0.2741160	total: 4.48s	remaining: 525ms
895:	learn: 0.2740572	total: 4.48s	remaining: 520ms
896:	learn: 0.2739981	total: 4.49s	remaining: 515ms
897:	learn: 0.2739702	total: 4.49s	remaining: 511ms
898:	learn: 0.2739333	total: 4.5s	remaining: 505ms
899:	learn: 0.2738680	total: 4.5s	remaining: 501ms
900:	learn: 0.2738214	total: 4.51s	remaining: 495ms
901:	learn: 0.2737956	total: 4.51s	remaining: 490ms
902:	learn: 0.2737329	total: 4.52s	remaining: 485ms
903:	learn: 0.2736833	total: 4.52s	remaining: 480ms
904:	learn: 0.2735941	total: 4.53s	remaining: 475ms
905:	learn: 0.2735616	total: 4.53s	remaining: 470ms
906:	learn: 0.2735189	total: 4.54s	remaining: 465ms
907:	learn: 0.2734678	total: 4.54s	remaining: 460ms
908:	learn: 0.2734034	total: 4.55s	remaining: 455ms
909:	learn: 0.2733447	total: 4.55s	remaining: 450ms
910:	learn: 0.2732624	total: 4.56s	remaining: 445ms
911:	learn: 0.2732230	total: 4.56s	remaining: 440ms
912:	learn: 0.2731526	total: 4.57s	remaining: 435ms
913:	learn: 0.2730805	total: 4.57s	remaining: 430ms
914:	learn: 0.2730221	total: 4.58s	remaining: 425ms
915:	learn: 0.2729519	total: 4.58s	remaining: 420ms
916:	learn: 0.2728963	total: 4.59s	remaining: 415ms
917:	learn: 0.2728419	total: 4.59s	remaining: 410ms
918:	learn: 0.2727652	total: 4.6s	remaining: 405ms
919:	learn: 0.2726949	total: 4.61s	remaining: 400ms
920:	learn: 0.2726209	total: 4.61s	remaining: 395ms
921:	learn: 0.2725610	total: 4.62s	remaining: 390ms
922:	learn: 0.2725068	total: 4.62s	remaining: 385ms
923:	learn: 0.2724367	total: 4.62s	remaining: 380ms
924:	learn: 0.2723742	total: 4.63s	remaining: 375ms
925:	learn: 0.2723031	total: 4.63s	remaining: 370ms
926:	learn: 0.2722465	total: 4.64s	remaining: 365ms
927:	learn: 0.2721789	total: 4.64s	remaining: 360ms
928:	learn: 0.2721426	total: 4.65s	remaining: 355ms
929:	learn: 0.2720855	total: 4.65s	remaining: 350ms
930:	learn: 0.2720281	total: 4.66s	remaining: 345ms
931:	learn: 0.2719737	total: 4.66s	remaining: 340ms
932:	learn: 0.2719428	total: 4.67s	remaining: 335ms
933:	learn: 0.2718783	total: 4.67s	remaining: 330ms
934:	learn: 0.2718036	total: 4.68s	remaining: 325ms
935:	learn: 0.2717498	total: 4.68s	remaining: 320ms
936:	learn: 0.2716687	total: 4.69s	remaining: 315ms
937:	learn: 0.2716072	total: 4.69s	remaining: 310ms
938:	learn: 0.2715490	total: 4.7s	remaining: 305ms
939:	learn: 0.2714593	total: 4.7s	remaining: 300ms
940:	learn: 0.2713786	total: 4.71s	remaining: 295ms
941:	learn: 0.2713249	total: 4.71s	remaining: 290ms
942:	learn: 0.2712782	total: 4.72s	remaining: 285ms
943:	learn: 0.2712259	total: 4.72s	remaining: 280ms
944:	learn: 0.2711444	total: 4.73s	remaining: 275ms
945:	learn: 0.2710927	total: 4.73s	remaining: 270ms
946:	learn: 0.2710409	total: 4.74s	remaining: 265ms
947:	learn: 0.2710017	total: 4.74s	remaining: 260ms
948:	learn: 0.2709340	total: 4.75s	remaining: 255ms
949:	learn: 0.2708971	total: 4.75s	remaining: 250ms
950:	learn: 0.2708522	total: 4.76s	remaining: 245ms
951:	learn: 0.2708165	total: 4.76s	remaining: 240ms
952:	learn: 0.2707339	total: 4.76s	remaining: 235ms
953:	learn: 0.2706978	total: 4.77s	remaining: 230ms
954:	learn: 0.2706478	total: 4.78s	remaining: 225ms
955:	learn: 0.2706090	total: 4.78s	remaining: 220ms
956:	learn: 0.2705652	total: 4.79s	remaining: 215ms
957:	learn: 0.2705105	total: 4.79s	remaining: 210ms
958:	learn: 0.2704592	total: 4.79s	remaining: 205ms
959:	learn: 0.2704040	total: 4.8s	remaining: 200ms
960:	learn: 0.2703662	total: 4.8s	remaining: 195ms
961:	learn: 0.2703178	total: 4.81s	remaining: 190ms
962:	learn: 0.2702412	total: 4.81s	remaining: 185ms
963:	learn: 0.2701647	total: 4.82s	remaining: 180ms
964:	learn: 0.2701102	total: 4.82s	remaining: 175ms
965:	learn: 0.2700498	total: 4.83s	remaining: 170ms
966:	learn: 0.2699676	total: 4.83s	remaining: 165ms
967:	learn: 0.2699127	total: 4.84s	remaining: 160ms
968:	learn: 0.2698586	total: 4.84s	remaining: 155ms
969:	learn: 0.2698003	total: 4.85s	remaining: 150ms
970:	learn: 0.2697571	total: 4.85s	remaining: 145ms
971:	learn: 0.2697041	total: 4.86s	remaining: 140ms
972:	learn: 0.2696757	total: 4.86s	remaining: 135ms
973:	learn: 0.2696186	total: 4.87s	remaining: 130ms
974:	learn: 0.2695722	total: 4.87s	remaining: 125ms
975:	learn: 0.2695103	total: 4.88s	remaining: 120ms
976:	learn: 0.2694553	total: 4.88s	remaining: 115ms
977:	learn: 0.2693918	total: 4.89s	remaining: 110ms
978:	learn: 0.2693534	total: 4.89s	remaining: 105ms
979:	learn: 0.2692817	total: 4.89s	remaining: 99.9ms
980:	learn: 0.2692135	total: 4.9s	remaining: 94.9ms
981:	learn: 0.2691703	total: 4.91s	remaining: 89.9ms
982:	learn: 0.2691042	total: 4.91s	remaining: 84.9ms
983:	learn: 0.2690229	total: 4.92s	remaining: 79.9ms
984:	learn: 0.2689721	total: 4.92s	remaining: 74.9ms
985:	learn: 0.2689120	total: 4.92s	remaining: 69.9ms
986:	learn: 0.2688568	total: 4.93s	remaining: 64.9ms
987:	learn: 0.2687918	total: 4.93s	remaining: 59.9ms
988:	learn: 0.2687408	total: 4.94s	remaining: 54.9ms
989:	learn: 0.2686634	total: 4.94s	remaining: 49.9ms
990:	learn: 0.2685967	total: 4.95s	remaining: 44.9ms
991:	learn: 0.2685596	total: 4.95s	remaining: 39.9ms
992:	learn: 0.2684979	total: 4.96s	remaining: 34.9ms
993:	learn: 0.2684354	total: 4.96s	remaining: 30ms
994:	learn: 0.2683948	total: 4.97s	remaining: 25ms
995:	learn: 0.2683644	total: 4.97s	remaining: 20ms
996:	learn: 0.2682966	total: 4.98s	remaining: 15ms
997:	learn: 0.2682249	total: 4.98s	remaining: 9.98ms
998:	learn: 0.2681457	total: 4.99s	remaining: 4.99ms
999:	learn: 0.2680778	total: 4.99s	remaining: 0us
In [128]:
cat_finalscore
Out[128]:
0.8552683117900509
In [129]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.88      0.94      0.91     19493
           1       0.75      0.58      0.65      6037

    accuracy                           0.86     25530
   macro avg       0.82      0.76      0.78     25530
weighted avg       0.85      0.86      0.85     25530

In [130]:
r={"Catboost":85.5268311}
Results.append(r)
Results
Out[130]:
['df_na_mean_imp',
 {'Logistic Regression': 83.533381},
 {'xgboost': 85.24481},
 {'RandomForestClassifier': 84.833529},
 {'KNeighborsClassifier': 83.6153544},
 {'Gaussian Naive Bayes': 78.7896592},
 {'Gradient Boosting Classifier': 84.55542499},
 {'LightGBM': 85.217391304},
 {'Catboost': 85.5268311}]
In [131]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();
In [ ]:
 

----> Best Model is CatBOOST <----

In [ ]:
 

df_outliers__colna_rm¶

In [132]:
r="df_outliers__colna_rm"
Results.append(r)
Results
Out[132]:
['df_na_mean_imp',
 {'Logistic Regression': 83.533381},
 {'xgboost': 85.24481},
 {'RandomForestClassifier': 84.833529},
 {'KNeighborsClassifier': 83.6153544},
 {'Gaussian Naive Bayes': 78.7896592},
 {'Gradient Boosting Classifier': 84.55542499},
 {'LightGBM': 85.217391304},
 {'Catboost': 85.5268311},
 'df_outliers__colna_rm']
In [133]:
df_outliers__colna_rm ['TempDiff'] = df_outliers__colna_rm ['MaxTemp'] - df_outliers__colna_rm ['MinTemp']
df_outliers__colna_rm =df_outliers__colna_rm .drop(['MaxTemp','MinTemp'],axis=1)
In [134]:
cols = df_outliers__colna_rm .columns.tolist()
cols = cols[-1:] + cols[:-1]
df_outliers__colna_rm  =df_outliers__colna_rm [cols]
df_outliers__colna_rm .info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 113011 entries, 0 to 145458
Data columns (total 15 columns):
 #   Column         Non-Null Count   Dtype         
---  ------         --------------   -----         
 0   TempDiff       113011 non-null  float64       
 1   Date           113011 non-null  datetime64[ns]
 2   Location       113011 non-null  object        
 3   Rainfall       113011 non-null  float64       
 4   WindGustDir    113011 non-null  object        
 5   WindGustSpeed  113011 non-null  float64       
 6   WindDir9am     113011 non-null  object        
 7   WindDir3pm     113011 non-null  object        
 8   WindSpeed9am   113011 non-null  float64       
 9   WindSpeed3pm   113011 non-null  float64       
 10  Humidity9am    113011 non-null  float64       
 11  Humidity3pm    113011 non-null  float64       
 12  Pressure3pm    113011 non-null  float64       
 13  RainToday      113011 non-null  object        
 14  RainTomorrow   113011 non-null  object        
dtypes: datetime64[ns](1), float64(8), object(6)
memory usage: 13.8+ MB
In [135]:
fig, ax = plt.subplots(figsize=(12,8))
mask = np.triu(np.ones_like(df_outliers__colna_rm  .corr(), dtype=np.bool_))
sns.heatmap(df_outliers__colna_rm  .corr(), annot=True, cmap="Blues", mask=mask, linewidth=0.5)
Out[135]:
<Axes: >
In [136]:
df_outliers__colna_rm .info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 113011 entries, 0 to 145458
Data columns (total 15 columns):
 #   Column         Non-Null Count   Dtype         
---  ------         --------------   -----         
 0   TempDiff       113011 non-null  float64       
 1   Date           113011 non-null  datetime64[ns]
 2   Location       113011 non-null  object        
 3   Rainfall       113011 non-null  float64       
 4   WindGustDir    113011 non-null  object        
 5   WindGustSpeed  113011 non-null  float64       
 6   WindDir9am     113011 non-null  object        
 7   WindDir3pm     113011 non-null  object        
 8   WindSpeed9am   113011 non-null  float64       
 9   WindSpeed3pm   113011 non-null  float64       
 10  Humidity9am    113011 non-null  float64       
 11  Humidity3pm    113011 non-null  float64       
 12  Pressure3pm    113011 non-null  float64       
 13  RainToday      113011 non-null  object        
 14  RainTomorrow   113011 non-null  object        
dtypes: datetime64[ns](1), float64(8), object(6)
memory usage: 13.8+ MB
In [137]:
#df_outliers__colna_rm =pd.concat([Date,Location,finalresult2,RainTomorrow],axis=1)
df_outliers__colna_rm .shape
Out[137]:
(113011, 15)

df_outliers__colna_rm ['Date'] = pd.to_numeric(pd.to_datetime(df_outliers__colna_rm ['Date'])) df_outliers__colna_rm .columns = [c.replace(' ', '_') for c in df_outliers__colna_rm ] df_outliers__colna_rm ['Location'] = df_outliers__colna_rm ['Location'].str.lower()

In [138]:
cat_cols
Out[138]:
['Date',
 'Location',
 'WindGustDir',
 'WindDir9am',
 'WindDir3pm',
 'RainToday',
 'RainTomorrow']
In [139]:
le = LabelEncoder()
df_outliers__colna_rm [cat_cols] =df_outliers__colna_rm [cat_cols].astype('str').apply(le.fit_transform)
In [140]:
df_outliers__colna_rm .isnull().sum()
Out[140]:
TempDiff         0
Date             0
Location         0
Rainfall         0
WindGustDir      0
WindGustSpeed    0
WindDir9am       0
WindDir3pm       0
WindSpeed9am     0
WindSpeed3pm     0
Humidity9am      0
Humidity3pm      0
Pressure3pm      0
RainToday        0
RainTomorrow     0
dtype: int64
In [141]:
df_outliers__colna_rm .head()
Out[141]:
TempDiff Date Location Rainfall WindGustDir WindGustSpeed WindDir9am WindDir3pm WindSpeed9am WindSpeed3pm Humidity9am Humidity3pm Pressure3pm RainToday RainTomorrow
0 9.5 377 1 0.6 13 44.0 13 14 20.0 24.0 71.0 22.0 1007.1 0 0
1 17.7 378 1 0.0 14 44.0 6 15 4.0 22.0 44.0 25.0 1007.8 0 0
2 12.8 379 1 0.0 15 46.0 13 15 19.0 26.0 38.0 30.0 1008.7 0 0
3 18.8 380 1 0.0 4 24.0 9 0 11.0 9.0 45.0 16.0 1012.8 0 0
4 14.8 381 1 1.0 13 41.0 1 7 7.0 20.0 82.0 33.0 1006.0 0 0
In [142]:
X = df_outliers__colna_rm .iloc[:, :-1]
y = df_outliers__colna_rm .iloc[:, -1:]

X_train, X_test, y_train, y_test = train_test_split(X, y,
                                                   test_size=0.3, random_state = 1)
In [ ]:
 
In [143]:
#Logistic Regression
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.linear_model import LogisticRegression
lr = LogisticRegression()
lr.fit(X_train,y_train)
Out[143]:
LogisticRegression()
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
LogisticRegression()
In [144]:
lr.intercept_
Out[144]:
array([0.00014398])
In [145]:
lr.coef_
Out[145]:
array([[-6.60270148e-03, -7.14789638e-06, -7.73999241e-03,
         2.35565964e-02,  1.79738076e-03,  7.92984634e-02,
        -2.14100804e-02, -2.82528264e-03, -2.19958567e-02,
        -4.21983604e-02,  7.26166767e-03,  6.71139781e-02,
        -7.32271722e-03,  4.25489699e-03]])
In [ ]:
 
In [146]:
predictions = lr.predict(X_test)
lr.score(X_train,y_train)
Out[146]:
0.8455130393011996
In [147]:
r={"LogisticRegression":84.55130}
Results.append(r)
Results
Out[147]:
['df_na_mean_imp',
 {'Logistic Regression': 83.533381},
 {'xgboost': 85.24481},
 {'RandomForestClassifier': 84.833529},
 {'KNeighborsClassifier': 83.6153544},
 {'Gaussian Naive Bayes': 78.7896592},
 {'Gradient Boosting Classifier': 84.55542499},
 {'LightGBM': 85.217391304},
 {'Catboost': 85.5268311},
 'df_outliers__colna_rm',
 {'LogisticRegression': 84.5513}]
In [148]:
print(confusion_matrix(y_test, predictions))
cm = confusion_matrix(y_test, predictions)

fig, ax = plt.subplots(figsize=(8, 8))
ax.imshow(cm)
ax.grid(False)
ax.xaxis.set(ticks=(0, 1), ticklabels=('Predicted 0s', 'Predicted 1s'))
ax.yaxis.set(ticks=(0, 1), ticklabels=('Actual 0s', 'Actual 1s'))
ax.set_ylim(1.5, -0.5)
for i in range(2):
    for j in range(2):
        ax.text(j, i, cm[i, j], ha='center', va='center', color='red')
plt.show()
[[25120  1344]
 [ 3937  3503]]
In [149]:
    print(classification_report(y_test, predictions))
              precision    recall  f1-score   support

           0       0.86      0.95      0.90     26464
           1       0.72      0.47      0.57      7440

    accuracy                           0.84     33904
   macro avg       0.79      0.71      0.74     33904
weighted avg       0.83      0.84      0.83     33904

In [150]:
print(accuracy_score(y_test, predictions))
0.8442366682397358
In [ ]:
 
In [151]:
#ROC Curve
from sklearn.metrics import roc_auc_score
from sklearn.metrics import roc_curve
logit_roc_auc = roc_auc_score(y_test, predictions)
fpr, tpr, thresholds = roc_curve(y_test, lr.predict_proba(X_test)[:,1])
plt.figure()
plt.plot(fpr, tpr, label='Logistic Regression (area = %0.2f)' % logit_roc_auc)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.savefig('Log_ROC')
plt.show()

XGBoost Documentation¶

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides a parallel tree boosting (also known as GBDT, GBM) that solve many data science problems in a fast and accurate way. The same code runs on major distributed environment (Hadoop, SGE, MPI) and can solve problems beyond billions of examples.

Gradient boosting is a machine learning technique used in regression and classification tasks, among others. It gives a prediction model in the form of an ensemble of weak prediction models, which are typically decision trees.[1][2] When a decision tree is the weak learner, the resulting algorithm is called gradient-boosted trees; it usually outperforms random forest.[1][2][3] A gradient-boosted trees model is built in a stage-wise fashion as in other boosting methods, but it generalizes the other methods by allowing optimization of an arbitrary differentiable loss function.

In [152]:
from xgboost import XGBClassifier
In [153]:
xgb_model = XGBClassifier().fit(X_train, y_train)
In [154]:
y_pred = xgb_model.predict(X_test)
In [155]:
accuracy_score(y_test, y_pred)
Out[155]:
0.8604884379424257
In [156]:
r={"XGBoost":86.0488437}
Results.append(r)
Results
Out[156]:
['df_na_mean_imp',
 {'Logistic Regression': 83.533381},
 {'xgboost': 85.24481},
 {'RandomForestClassifier': 84.833529},
 {'KNeighborsClassifier': 83.6153544},
 {'Gaussian Naive Bayes': 78.7896592},
 {'Gradient Boosting Classifier': 84.55542499},
 {'LightGBM': 85.217391304},
 {'Catboost': 85.5268311},
 'df_outliers__colna_rm',
 {'LogisticRegression': 84.5513},
 {'XGBoost': 86.0488437}]
In [157]:
from sklearn.metrics import mean_squared_error
rmse = mean_squared_error(y_test,y_pred, squared=False)
print(f"RMSE of the base model: {rmse:.3f}")
RMSE of the base model: 0.374
In [158]:
print(confusion_matrix(y_test, y_pred))
cm = confusion_matrix(y_test, y_pred)

fig, ax = plt.subplots(figsize=(8, 8))
ax.imshow(cm)
ax.grid(False)
ax.xaxis.set(ticks=(0, 1), ticklabels=('Predicted 0s', 'Predicted 1s'))
ax.yaxis.set(ticks=(0, 1), ticklabels=('Actual 0s', 'Actual 1s'))
ax.set_ylim(1.5, -0.5)
for i in range(2):
    for j in range(2):
        ax.text(j, i, cm[i, j], ha='center', va='center', color='red')
plt.show()
[[25023  1441]
 [ 3289  4151]]
In [159]:
# Compute micro-average ROC curve and ROC area
xgb_roc_auc = roc_auc_score(y_test, y_pred)
fpr, tpr, thresholds = roc_curve(y_test, xgb_model.predict_proba(X_test)[:,1])
plt.figure()
plt.plot(fpr, tpr, label='xgb_model (area = %0.2f)' % logit_roc_auc)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.savefig('Log_ROC')
plt.show()
In [160]:
print(classification_report(y_test, y_pred))
              precision    recall  f1-score   support

           0       0.88      0.95      0.91     26464
           1       0.74      0.56      0.64      7440

    accuracy                           0.86     33904
   macro avg       0.81      0.75      0.78     33904
weighted avg       0.85      0.86      0.85     33904

In [161]:
#RandomForestClassifier
rf= RandomForestClassifier()
rf.fit(X_train, y_train)
y_pred = rf.predict(X_test)
In [162]:
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)
Accuracy: 0.8545304388862671
In [163]:
r={"RandomForestClassifier":85.343912}
Results.append(r)
In [164]:
# Export the first three decision trees from the forest
from sklearn.tree import export_graphviz
from IPython.display import Image
import graphviz
for i in range(3):
    tree = rf.estimators_[i]
    dot_data = export_graphviz(tree,
                               feature_names=X_train.columns,  
                               filled=True,  
                               max_depth=2, 
                               impurity=False, 
                               proportion=True)
    graph = graphviz.Source(dot_data)
    display(graph)
Tree 0 Rainfall <= 0.75 samples = 100.0% value = [0.775, 0.225] 1 Humidity9am <= 68.5 samples = 75.0% value = [0.853, 0.147] 0->1 True 12004 TempDiff <= 7.15 samples = 25.0% value = [0.542, 0.458] 0->12004 False 2 WindGustSpeed <= 55.0 samples = 44.9% value = [0.892, 0.108] 1->2 6045 WindGustSpeed <= 47.0 samples = 30.0% value = [0.795, 0.205] 1->6045 3 (...) 2->3 4584 (...) 2->4584 6046 (...) 6045->6046 10645 (...) 6045->10645 12005 Pressure3pm <= 1011.95 samples = 13.2% value = [0.42, 0.58] 12004->12005 15410 Humidity3pm <= 65.5 samples = 11.9% value = [0.678, 0.322] 12004->15410 12006 (...) 12005->12006 13133 (...) 12005->13133 15411 (...) 15410->15411 17292 (...) 15410->17292
Tree 0 TempDiff <= 7.55 samples = 100.0% value = [0.777, 0.223] 1 Rainfall <= 2.15 samples = 28.3% value = [0.556, 0.444] 0->1 True 7268 Rainfall <= 0.55 samples = 71.7% value = [0.865, 0.135] 0->7268 False 2 Pressure3pm <= 1012.05 samples = 17.2% value = [0.663, 0.337] 1->2 4493 Humidity3pm <= 74.5 samples = 11.2% value = [0.39, 0.61] 1->4493 3 (...) 2->3 1532 (...) 2->1532 4494 (...) 4493->4494 6193 (...) 4493->6193 7269 TempDiff <= 10.85 samples = 60.3% value = [0.893, 0.107] 7268->7269 15532 Humidity9am <= 75.5 samples = 11.3% value = [0.712, 0.288] 7268->15532 7270 (...) 7269->7270 10579 (...) 7269->10579 15533 (...) 15532->15533 16510 (...) 15532->16510
Tree 0 Humidity3pm <= 67.5 samples = 100.0% value = [0.776, 0.224] 1 TempDiff <= 9.45 samples = 78.8% value = [0.872, 0.128] 0->1 True 12506 Humidity3pm <= 81.5 samples = 21.2% value = [0.415, 0.585] 0->12506 False 2 RainToday <= 0.5 samples = 25.1% value = [0.788, 0.212] 1->2 5893 RainToday <= 0.5 samples = 53.8% value = [0.912, 0.088] 1->5893 3 (...) 2->3 3632 (...) 2->3632 5894 (...) 5893->5894 11671 (...) 5893->11671 12507 Rainfall <= 2.35 samples = 13.6% value = [0.552, 0.448] 12506->12507 16642 WindGustSpeed <= 38.0 samples = 7.6% value = [0.175, 0.825] 12506->16642 12508 (...) 12507->12508 15221 (...) 12507->15221 16643 (...) 16642->16643 17328 (...) 16642->17328
In [ ]:
 
In [165]:
# Create the confusion matrix
from sklearn.metrics import ConfusionMatrixDisplay
cm = confusion_matrix(y_test, y_pred)

ConfusionMatrixDisplay(confusion_matrix=cm).plot();
In [166]:
accuracy = accuracy_score(y_test, y_pred)
precision = precision_score(y_test, y_pred)
recall = recall_score(y_test, y_pred)
from sklearn.metrics import mean_absolute_error, mean_squared_error

print("Accuracy:", accuracy)
print("Precision:", precision)
print("Recall:", recall)

print('Mean Absolute Error:', mean_absolute_error(y_test, y_pred))
print('Mean Squared Error:', mean_squared_error(y_test, y_pred))
print('Root Mean Squared Error:', np.sqrt(mean_squared_error(y_test, y_pred)))
Accuracy: 0.8545304388862671
Precision: 0.751101321585903
Recall: 0.5041666666666667
Mean Absolute Error: 0.14546956111373288
Mean Squared Error: 0.14546956111373288
Root Mean Squared Error: 0.3814047208854826
In [167]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.87      0.95      0.91     26464
           1       0.75      0.50      0.60      7440

    accuracy                           0.85     33904
   macro avg       0.81      0.73      0.76     33904
weighted avg       0.85      0.85      0.84     33904

In [168]:
# Organizing feature names and importances in a DataFrame
features_df = pd.DataFrame({'features': rf.feature_names_in_, 'importances': rf.feature_importances_ })

# Sorting data from highest to lowest
features_df_sorted = features_df.sort_values(by='importances', ascending=False)

# Barplot of the result without borders and axis lines
g = sns.barplot(data=features_df_sorted, x='importances', y ='features', palette="rocket")
sns.despine(bottom = True, left = True)
g.set_title('Feature importances')
g.set(xlabel=None)
g.set(ylabel=None)
g.set(xticks=[])
for value in g.containers:
    g.bar_label(value, padding=2)
In [169]:
#KNeighborsClassifier
In [170]:
#KNeighborsClassifier
knn = KNeighborsClassifier()
knn.fit(X_train, y_train)
knn_head = knn.predict(X_test)

print(f"""
accuracy_score: {accuracy_score(knn_head, y_test)}
roc_auc_score: {roc_auc_score(knn_head, y_test)}
""")
accuracy_score: 0.8370988673902784
roc_auc_score: 0.7765035257609642

In [171]:
def found_good_neighbors_1(n, p):
    knn = KNeighborsClassifier(n_neighbors=n, p=p, 
                               metric='minkowski')
    knn.fit(X_train, y_train)
    return knn.score(X_test, y_test)

def found_goot_depth(n, criterion_):
    tree = DecisionTreeClassifier(max_depth=n, 
                                  criterion=criterion_,
                                  random_state=42)
    tree.fit(X_train, y_train)
    return tree.score(X_test, y_test)
In [172]:
knn_1 = [found_good_neighbors_1(n, 1) for n in range(1, 22, 2)]
knn_2 = [found_good_neighbors_1(n, 2) for n in range(1, 22, 2)]
In [173]:
tree_gini = [found_goot_depth(n, 'gini') for n in range(1, 22, 2)]
tree_entropy = [found_goot_depth(n, 'entropy') for n in range(1, 22, 2)]
In [174]:
len(knn_1)
Out[174]:
11
In [175]:
l=knn_1[0]
a=1
for i in range(0,10):
    if (knn_1[i]>l):
        l=knn_1[i]
        if(i==0):
            a=1
        else:
            a=(i*2)+1
            
       
print (l,a)
0.8455639452571968 11
In [176]:
plt.figure(figsize=(12, 7))
plt.subplot(2, 2, 1)
plt.plot(tree_gini)
plt.title('tree_gini')
plt.legend(['score'])
plt.subplot(2, 2, 2)
plt.plot(tree_entropy)
plt.title('tree_entropy')
plt.legend(['score'])
plt.subplot(2, 2, 3)
plt.plot(knn_1)
plt.title('knn_1')
plt.legend(['score'])
plt.subplot(2, 2, 4)
plt.plot(knn_2)
plt.title('knn_2')
plt.legend(['score'])
plt.show()
In [177]:
print(f"""
tree_gini: {max(tree_gini)}
tree_entropy: {max(tree_entropy)}
knn_1: {max(knn_1)}
knn_2: {max(knn_2)}
""")
tree_gini: 0.8441186880604058
tree_entropy: 0.8443251533742331
knn_1: 0.8456819254365266
knn_2: 0.8440596979707409

As we can see the decisive trees begin to fall at a depth of 4-5. What we cannot say about the nearest-neighbor method. I think we should still do tests starting from 20 to 50 in increments of 3 for nearest neighbours

In [178]:
knn_1 = [found_good_neighbors_1(n, 1) for n in range(20, 51, 3)]
knn_2 = [found_good_neighbors_1(n, 2) for n in range(20, 51, 3)]
In [179]:
plt.figure(figsize=(14, 9))
plt.subplot(2,2,1)
plt.plot(knn_1)
plt.title('knn_1')
plt.legend(['score'])
plt.subplot(2, 2, 2)
plt.plot(knn_2)
plt.title('knn_2')
plt.legend(['score'])
plt.show()
In [180]:
print(f"""
knn_1: {max(knn_1)}
knn_2: {max(knn_2)}
""")
knn_1: 0.8450920245398773
knn_2: 0.8440302029259085

In [181]:
#knn_1: 0.836153544849197 at k=9
knn = KNeighborsClassifier(n_neighbors=11, p=1, 
                               metric='minkowski')
knn.fit(X_train, y_train)
knn_head = knn.predict(X_test)

print(f"""
accuracy_score: {accuracy_score(knn_head, y_test)}
roc_auc_score: {roc_auc_score(knn_head, y_test)}
""")
accuracy_score: 0.8455639452571968
roc_auc_score: 0.8083671461594649

In [182]:
r={"knn":84.5563}
Results.append(r)
In [183]:
# Evaluate Model
# Create the confusion matrix
#from sklearn.metrics import ConfusionMatrixDisplay
cm = confusion_matrix(y_test, knn_head)

ConfusionMatrixDisplay(confusion_matrix=cm).plot();
In [184]:
print(classification_report(y_test,knn_head))
              precision    recall  f1-score   support

           0       0.86      0.96      0.91     26464
           1       0.76      0.43      0.55      7440

    accuracy                           0.85     33904
   macro avg       0.81      0.70      0.73     33904
weighted avg       0.84      0.85      0.83     33904

Gaussian Naive Bayes¶

The Naive Bayes algorithm calculates the probability of a sample belonging to a particular class given its features. It assumes that the features are conditionally independent of each other, given the class variable. This is known as the "naive" assumption, as it assumes independence between features, which may not hold in reality. However, despite this simplifying assumption, Naive Bayes has been shown to perform well in practice, especially in situations where the independence assumption is not severely violated.

the Gaussian Naive Bayes model assumes that the features are continuous and normally distributed, and it uses Bayes' theorem to calculate the probabilities of the sample belonging to each class. Despite its simplicity and the naive assumption of feature independence, it often performs well in practice and is widely used for classification tasks, especially when dealing with continuous-valued features.

In [185]:
from sklearn.naive_bayes import GaussianNB
gnb = GaussianNB()
gnb.fit(X_train, y_train)
y_pred = gnb.predict(X_test)
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)
report = classification_report(y_test, y_pred)
print("Classification Report:\n", report)
Accuracy: 0.8040349221330817
Classification Report:
               precision    recall  f1-score   support

           0       0.87      0.88      0.87     26464
           1       0.55      0.54      0.55      7440

    accuracy                           0.80     33904
   macro avg       0.71      0.71      0.71     33904
weighted avg       0.80      0.80      0.80     33904

In [186]:
r={"Gaussian Naive Bayes":80.4034922}
Results.append(r)
In [187]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();
In [188]:
from sklearn.metrics import roc_auc_score

ROC_AUC = roc_auc_score(y_test, y_pred)

print('ROC AUC : {:.4f}'.format(ROC_AUC))
ROC AUC : 0.7095
In [189]:
# calculate cross-validated ROC AUC 

from sklearn.model_selection import cross_val_score

Cross_validated_ROC_AUC = cross_val_score(gnb, X_train, y_train, cv=5, scoring='roc_auc').mean()

print('Cross validated ROC AUC : {:.4f}'.format(Cross_validated_ROC_AUC))
Cross validated ROC AUC : 0.8212

Gradient Boosting Classifier¶

Gradient Boosting Classifier is a machine learning algorithm that belongs to the family of ensemble methods. It combines multiple weak learners, typically decision trees, to create a stronger predictive model. It is known for its high accuracy and ability to handle complex datasets.

The basic idea behind gradient boosting is to iteratively train a series of weak models, where each subsequent model is built to correct the mistakes made by the previous models. The models are trained in a stage-wise fashion, where each stage focuses on minimizing a loss function using gradient descent optimization.

In [190]:
from sklearn.ensemble import GradientBoostingClassifier
gbm_model = GradientBoostingClassifier().fit(X_train, y_train)
y_pred = gbm_model.predict(X_test)
accuracy_score(y_test, y_pred)
Out[190]:
0.8516694195375177
In [191]:
r={"Gradient Boosting Classifier":85.1669419}
Results.append(r)
In [192]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();
In [193]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.87      0.95      0.91     26464
           1       0.74      0.50      0.60      7440

    accuracy                           0.85     33904
   macro avg       0.80      0.73      0.75     33904
weighted avg       0.84      0.85      0.84     33904

LightGBM¶

LightGBM is a gradient boosting framework that is designed to be efficient and highly scalable. It is an open-source library developed by Microsoft and has gained popularity in the machine learning community for its speed and performance.

Gradient boosting algorithm: LightGBM is based on the gradient boosting algorithm, similar to other boosting frameworks like XGBoost and CatBoost. It builds an ensemble of weak models, typically decision trees, to create a stronger predictive model.

Unlike traditional depth-wise tree growth, LightGBM uses a leaf-wise approach. This means that the tree is grown by expanding the leaf with the maximum loss reduction, leading to a more balanced tree structure and reducing the number of levels in the tree.

In [194]:
from lightgbm import LGBMClassifier
lgbm_model = LGBMClassifier().fit(X_train, y_train)
y_pred = lgbm_model.predict(X_test)
accuracy_score(y_test, y_pred)
Out[194]:
0.8581878244454931
In [195]:
r={"LightGBM":85.818782}
Results.append(r)
In [196]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();
In [197]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.88      0.95      0.91     26464
           1       0.75      0.54      0.62      7440

    accuracy                           0.86     33904
   macro avg       0.81      0.74      0.77     33904
weighted avg       0.85      0.86      0.85     33904

In [198]:
print('Training accuracy {:.4f}'.format(lgbm_model.score(X_train,y_train)))
print('Testing accuracy {:.4f}'.format(lgbm_model.score(X_test,y_test)))
Training accuracy 0.8696
Testing accuracy 0.8582
In [199]:
# As we can clearly see that there is absolutely no significant difference between both the accuracies and hence the model has made an estimation that is quite accurate.
In [200]:
import lightgbm as lgb
lgb.plot_importance(lgbm_model)
Out[200]:
<Axes: title={'center': 'Feature importance'}, xlabel='Feature importance', ylabel='Features'>
In [201]:
lgb.plot_tree(lgbm_model,figsize=(30,40))
Out[201]:
<Axes: >
In [ ]:
 

Catboost¶

CatBoost is a gradient boosting algorithm that is known for its ability to handle categorical features effectively. It is an open-source machine learning library developed by Yandex and offers a number of features that make it a popular choice for various classification and regression tasks.

CatBoost incorporates an innovative approach to handling categorical features. It uses a combination of ordered boosting and a novel method called "gradient-based One-Hot Encoding" (OB+OHE). This approach efficiently encodes

CatBoost has a built-in mechanism to handle missing values in the data. During training, it automatically learns how to treat missing values without requiring any additional preprocessing steps.

CatBoost provides GPU acceleration, allowing for faster training and inference on compatible hardware. This is particularly beneficial when dealing with large datasets or complex models.

In [202]:
from catboost import CatBoostClassifier, Pool
cat = CatBoostClassifier()
cat.fit(X_train, y_train)
y_pred = cat.predict(X_test)
cat_finalscore = accuracy_score(y_test, y_pred)
Learning rate set to 0.0666
0:	learn: 0.6424319	total: 10.8ms	remaining: 10.8s
1:	learn: 0.6012697	total: 16.8ms	remaining: 8.39s
2:	learn: 0.5673068	total: 23.4ms	remaining: 7.76s
3:	learn: 0.5398692	total: 30ms	remaining: 7.47s
4:	learn: 0.5148278	total: 36.1ms	remaining: 7.18s
5:	learn: 0.4931305	total: 42.4ms	remaining: 7.02s
6:	learn: 0.4744792	total: 48.7ms	remaining: 6.91s
7:	learn: 0.4601179	total: 55.5ms	remaining: 6.88s
8:	learn: 0.4460556	total: 62.5ms	remaining: 6.88s
9:	learn: 0.4339956	total: 68.7ms	remaining: 6.8s
10:	learn: 0.4246277	total: 75.3ms	remaining: 6.77s
11:	learn: 0.4178817	total: 81.7ms	remaining: 6.73s
12:	learn: 0.4105547	total: 88ms	remaining: 6.68s
13:	learn: 0.4036674	total: 94.3ms	remaining: 6.64s
14:	learn: 0.3982159	total: 101ms	remaining: 6.62s
15:	learn: 0.3939568	total: 107ms	remaining: 6.61s
16:	learn: 0.3894763	total: 114ms	remaining: 6.58s
17:	learn: 0.3862263	total: 121ms	remaining: 6.59s
18:	learn: 0.3830167	total: 127ms	remaining: 6.57s
19:	learn: 0.3794842	total: 133ms	remaining: 6.54s
20:	learn: 0.3775000	total: 140ms	remaining: 6.52s
21:	learn: 0.3751508	total: 146ms	remaining: 6.49s
22:	learn: 0.3732912	total: 153ms	remaining: 6.49s
23:	learn: 0.3713106	total: 160ms	remaining: 6.5s
24:	learn: 0.3694772	total: 167ms	remaining: 6.51s
25:	learn: 0.3675227	total: 174ms	remaining: 6.51s
26:	learn: 0.3657068	total: 180ms	remaining: 6.48s
27:	learn: 0.3645348	total: 186ms	remaining: 6.44s
28:	learn: 0.3631949	total: 191ms	remaining: 6.4s
29:	learn: 0.3618993	total: 197ms	remaining: 6.38s
30:	learn: 0.3606025	total: 204ms	remaining: 6.37s
31:	learn: 0.3596921	total: 209ms	remaining: 6.31s
32:	learn: 0.3588365	total: 215ms	remaining: 6.29s
33:	learn: 0.3579730	total: 221ms	remaining: 6.27s
34:	learn: 0.3569619	total: 227ms	remaining: 6.26s
35:	learn: 0.3560435	total: 236ms	remaining: 6.32s
36:	learn: 0.3553735	total: 242ms	remaining: 6.3s
37:	learn: 0.3547847	total: 249ms	remaining: 6.3s
38:	learn: 0.3539530	total: 255ms	remaining: 6.29s
39:	learn: 0.3534094	total: 262ms	remaining: 6.28s
40:	learn: 0.3526585	total: 268ms	remaining: 6.28s
41:	learn: 0.3521005	total: 275ms	remaining: 6.27s
42:	learn: 0.3517317	total: 280ms	remaining: 6.24s
43:	learn: 0.3512112	total: 286ms	remaining: 6.21s
44:	learn: 0.3506755	total: 292ms	remaining: 6.2s
45:	learn: 0.3501436	total: 298ms	remaining: 6.19s
46:	learn: 0.3498008	total: 304ms	remaining: 6.17s
47:	learn: 0.3493155	total: 310ms	remaining: 6.15s
48:	learn: 0.3489157	total: 317ms	remaining: 6.16s
49:	learn: 0.3485093	total: 324ms	remaining: 6.15s
50:	learn: 0.3480594	total: 330ms	remaining: 6.14s
51:	learn: 0.3477583	total: 338ms	remaining: 6.16s
52:	learn: 0.3474241	total: 344ms	remaining: 6.15s
53:	learn: 0.3470711	total: 351ms	remaining: 6.15s
54:	learn: 0.3466603	total: 357ms	remaining: 6.14s
55:	learn: 0.3464120	total: 364ms	remaining: 6.14s
56:	learn: 0.3460341	total: 370ms	remaining: 6.12s
57:	learn: 0.3457509	total: 377ms	remaining: 6.12s
58:	learn: 0.3454310	total: 383ms	remaining: 6.11s
59:	learn: 0.3450102	total: 390ms	remaining: 6.1s
60:	learn: 0.3447383	total: 396ms	remaining: 6.1s
61:	learn: 0.3445343	total: 403ms	remaining: 6.1s
62:	learn: 0.3442544	total: 410ms	remaining: 6.1s
63:	learn: 0.3439380	total: 416ms	remaining: 6.08s
64:	learn: 0.3435929	total: 423ms	remaining: 6.09s
65:	learn: 0.3432915	total: 430ms	remaining: 6.09s
66:	learn: 0.3430875	total: 436ms	remaining: 6.08s
67:	learn: 0.3428479	total: 443ms	remaining: 6.07s
68:	learn: 0.3426724	total: 449ms	remaining: 6.06s
69:	learn: 0.3424515	total: 456ms	remaining: 6.05s
70:	learn: 0.3422186	total: 462ms	remaining: 6.04s
71:	learn: 0.3420162	total: 468ms	remaining: 6.04s
72:	learn: 0.3417554	total: 475ms	remaining: 6.03s
73:	learn: 0.3414802	total: 481ms	remaining: 6.02s
74:	learn: 0.3413259	total: 489ms	remaining: 6.03s
75:	learn: 0.3411209	total: 496ms	remaining: 6.03s
76:	learn: 0.3408738	total: 502ms	remaining: 6.02s
77:	learn: 0.3405632	total: 509ms	remaining: 6.01s
78:	learn: 0.3402655	total: 515ms	remaining: 6.01s
79:	learn: 0.3399846	total: 521ms	remaining: 6s
80:	learn: 0.3398380	total: 528ms	remaining: 5.99s
81:	learn: 0.3393981	total: 534ms	remaining: 5.98s
82:	learn: 0.3390869	total: 541ms	remaining: 5.98s
83:	learn: 0.3389126	total: 547ms	remaining: 5.97s
84:	learn: 0.3388116	total: 554ms	remaining: 5.96s
85:	learn: 0.3386195	total: 560ms	remaining: 5.95s
86:	learn: 0.3384712	total: 567ms	remaining: 5.95s
87:	learn: 0.3382318	total: 573ms	remaining: 5.94s
88:	learn: 0.3380487	total: 579ms	remaining: 5.93s
89:	learn: 0.3378702	total: 585ms	remaining: 5.92s
90:	learn: 0.3377044	total: 592ms	remaining: 5.91s
91:	learn: 0.3374927	total: 598ms	remaining: 5.9s
92:	learn: 0.3373755	total: 605ms	remaining: 5.9s
93:	learn: 0.3371374	total: 613ms	remaining: 5.91s
94:	learn: 0.3368986	total: 621ms	remaining: 5.92s
95:	learn: 0.3367790	total: 629ms	remaining: 5.92s
96:	learn: 0.3366519	total: 636ms	remaining: 5.92s
97:	learn: 0.3364279	total: 642ms	remaining: 5.91s
98:	learn: 0.3362472	total: 650ms	remaining: 5.91s
99:	learn: 0.3361227	total: 657ms	remaining: 5.92s
100:	learn: 0.3358804	total: 665ms	remaining: 5.92s
101:	learn: 0.3356367	total: 673ms	remaining: 5.92s
102:	learn: 0.3355336	total: 680ms	remaining: 5.92s
103:	learn: 0.3353390	total: 688ms	remaining: 5.92s
104:	learn: 0.3351747	total: 695ms	remaining: 5.93s
105:	learn: 0.3350008	total: 703ms	remaining: 5.93s
106:	learn: 0.3348530	total: 711ms	remaining: 5.93s
107:	learn: 0.3346334	total: 721ms	remaining: 5.95s
108:	learn: 0.3344531	total: 729ms	remaining: 5.96s
109:	learn: 0.3343338	total: 737ms	remaining: 5.96s
110:	learn: 0.3341528	total: 747ms	remaining: 5.98s
111:	learn: 0.3340157	total: 754ms	remaining: 5.98s
112:	learn: 0.3337862	total: 761ms	remaining: 5.97s
113:	learn: 0.3336417	total: 768ms	remaining: 5.97s
114:	learn: 0.3334844	total: 776ms	remaining: 5.97s
115:	learn: 0.3333017	total: 783ms	remaining: 5.96s
116:	learn: 0.3331718	total: 790ms	remaining: 5.96s
117:	learn: 0.3330213	total: 798ms	remaining: 5.96s
118:	learn: 0.3329447	total: 805ms	remaining: 5.96s
119:	learn: 0.3327646	total: 814ms	remaining: 5.97s
120:	learn: 0.3326353	total: 820ms	remaining: 5.96s
121:	learn: 0.3324476	total: 827ms	remaining: 5.95s
122:	learn: 0.3323190	total: 834ms	remaining: 5.94s
123:	learn: 0.3321358	total: 840ms	remaining: 5.94s
124:	learn: 0.3320130	total: 847ms	remaining: 5.93s
125:	learn: 0.3319212	total: 854ms	remaining: 5.92s
126:	learn: 0.3318141	total: 860ms	remaining: 5.91s
127:	learn: 0.3316599	total: 867ms	remaining: 5.91s
128:	learn: 0.3315383	total: 874ms	remaining: 5.9s
129:	learn: 0.3314282	total: 880ms	remaining: 5.89s
130:	learn: 0.3313254	total: 887ms	remaining: 5.88s
131:	learn: 0.3311779	total: 893ms	remaining: 5.87s
132:	learn: 0.3310823	total: 901ms	remaining: 5.87s
133:	learn: 0.3309699	total: 908ms	remaining: 5.87s
134:	learn: 0.3308336	total: 915ms	remaining: 5.86s
135:	learn: 0.3306808	total: 924ms	remaining: 5.87s
136:	learn: 0.3305342	total: 936ms	remaining: 5.9s
137:	learn: 0.3304361	total: 946ms	remaining: 5.91s
138:	learn: 0.3302446	total: 953ms	remaining: 5.9s
139:	learn: 0.3301647	total: 958ms	remaining: 5.88s
140:	learn: 0.3299307	total: 964ms	remaining: 5.87s
141:	learn: 0.3298004	total: 970ms	remaining: 5.86s
142:	learn: 0.3297295	total: 976ms	remaining: 5.85s
143:	learn: 0.3295791	total: 982ms	remaining: 5.84s
144:	learn: 0.3294520	total: 989ms	remaining: 5.83s
145:	learn: 0.3293741	total: 996ms	remaining: 5.83s
146:	learn: 0.3292466	total: 1s	remaining: 5.82s
147:	learn: 0.3291468	total: 1.01s	remaining: 5.81s
148:	learn: 0.3289369	total: 1.01s	remaining: 5.8s
149:	learn: 0.3287623	total: 1.02s	remaining: 5.79s
150:	learn: 0.3286447	total: 1.03s	remaining: 5.77s
151:	learn: 0.3285819	total: 1.03s	remaining: 5.76s
152:	learn: 0.3284804	total: 1.04s	remaining: 5.75s
153:	learn: 0.3283331	total: 1.04s	remaining: 5.73s
154:	learn: 0.3280846	total: 1.05s	remaining: 5.72s
155:	learn: 0.3279085	total: 1.05s	remaining: 5.7s
156:	learn: 0.3277879	total: 1.06s	remaining: 5.69s
157:	learn: 0.3276637	total: 1.06s	remaining: 5.68s
158:	learn: 0.3275897	total: 1.07s	remaining: 5.67s
159:	learn: 0.3274625	total: 1.08s	remaining: 5.65s
160:	learn: 0.3273712	total: 1.08s	remaining: 5.64s
161:	learn: 0.3272611	total: 1.09s	remaining: 5.63s
162:	learn: 0.3271765	total: 1.09s	remaining: 5.61s
163:	learn: 0.3270500	total: 1.1s	remaining: 5.6s
164:	learn: 0.3269383	total: 1.1s	remaining: 5.59s
165:	learn: 0.3267676	total: 1.11s	remaining: 5.58s
166:	learn: 0.3266984	total: 1.11s	remaining: 5.57s
167:	learn: 0.3265572	total: 1.12s	remaining: 5.55s
168:	learn: 0.3264623	total: 1.13s	remaining: 5.54s
169:	learn: 0.3263896	total: 1.13s	remaining: 5.53s
170:	learn: 0.3262384	total: 1.14s	remaining: 5.52s
171:	learn: 0.3261129	total: 1.15s	remaining: 5.51s
172:	learn: 0.3259579	total: 1.15s	remaining: 5.5s
173:	learn: 0.3258123	total: 1.16s	remaining: 5.49s
174:	learn: 0.3256688	total: 1.16s	remaining: 5.48s
175:	learn: 0.3255034	total: 1.17s	remaining: 5.47s
176:	learn: 0.3253788	total: 1.17s	remaining: 5.46s
177:	learn: 0.3252749	total: 1.18s	remaining: 5.45s
178:	learn: 0.3251587	total: 1.19s	remaining: 5.44s
179:	learn: 0.3250304	total: 1.19s	remaining: 5.44s
180:	learn: 0.3249477	total: 1.2s	remaining: 5.43s
181:	learn: 0.3248128	total: 1.21s	remaining: 5.42s
182:	learn: 0.3247150	total: 1.21s	remaining: 5.41s
183:	learn: 0.3245753	total: 1.22s	remaining: 5.41s
184:	learn: 0.3244316	total: 1.22s	remaining: 5.39s
185:	learn: 0.3243208	total: 1.23s	remaining: 5.38s
186:	learn: 0.3242209	total: 1.24s	remaining: 5.37s
187:	learn: 0.3241426	total: 1.24s	remaining: 5.36s
188:	learn: 0.3240103	total: 1.25s	remaining: 5.35s
189:	learn: 0.3238847	total: 1.25s	remaining: 5.33s
190:	learn: 0.3237921	total: 1.26s	remaining: 5.33s
191:	learn: 0.3236879	total: 1.26s	remaining: 5.31s
192:	learn: 0.3234390	total: 1.27s	remaining: 5.3s
193:	learn: 0.3233200	total: 1.27s	remaining: 5.3s
194:	learn: 0.3232017	total: 1.28s	remaining: 5.29s
195:	learn: 0.3230780	total: 1.29s	remaining: 5.28s
196:	learn: 0.3229267	total: 1.29s	remaining: 5.27s
197:	learn: 0.3228182	total: 1.3s	remaining: 5.26s
198:	learn: 0.3226926	total: 1.31s	remaining: 5.26s
199:	learn: 0.3225897	total: 1.31s	remaining: 5.25s
200:	learn: 0.3224276	total: 1.32s	remaining: 5.25s
201:	learn: 0.3222463	total: 1.32s	remaining: 5.24s
202:	learn: 0.3221022	total: 1.33s	remaining: 5.23s
203:	learn: 0.3219966	total: 1.34s	remaining: 5.22s
204:	learn: 0.3218942	total: 1.34s	remaining: 5.21s
205:	learn: 0.3217778	total: 1.35s	remaining: 5.21s
206:	learn: 0.3216405	total: 1.36s	remaining: 5.2s
207:	learn: 0.3215358	total: 1.36s	remaining: 5.18s
208:	learn: 0.3214390	total: 1.37s	remaining: 5.17s
209:	learn: 0.3213259	total: 1.37s	remaining: 5.16s
210:	learn: 0.3212500	total: 1.38s	remaining: 5.15s
211:	learn: 0.3211321	total: 1.38s	remaining: 5.14s
212:	learn: 0.3210528	total: 1.39s	remaining: 5.13s
213:	learn: 0.3209673	total: 1.39s	remaining: 5.12s
214:	learn: 0.3208611	total: 1.4s	remaining: 5.11s
215:	learn: 0.3207601	total: 1.41s	remaining: 5.1s
216:	learn: 0.3206057	total: 1.41s	remaining: 5.09s
217:	learn: 0.3204887	total: 1.42s	remaining: 5.09s
218:	learn: 0.3204072	total: 1.42s	remaining: 5.08s
219:	learn: 0.3203106	total: 1.43s	remaining: 5.07s
220:	learn: 0.3201164	total: 1.44s	remaining: 5.06s
221:	learn: 0.3200247	total: 1.44s	remaining: 5.05s
222:	learn: 0.3199398	total: 1.45s	remaining: 5.04s
223:	learn: 0.3197890	total: 1.45s	remaining: 5.04s
224:	learn: 0.3197106	total: 1.46s	remaining: 5.03s
225:	learn: 0.3195793	total: 1.47s	remaining: 5.02s
226:	learn: 0.3194713	total: 1.47s	remaining: 5.01s
227:	learn: 0.3193704	total: 1.48s	remaining: 5.01s
228:	learn: 0.3192623	total: 1.49s	remaining: 5.01s
229:	learn: 0.3191886	total: 1.49s	remaining: 5s
230:	learn: 0.3190990	total: 1.5s	remaining: 5s
231:	learn: 0.3190209	total: 1.51s	remaining: 5s
232:	learn: 0.3189334	total: 1.52s	remaining: 5s
233:	learn: 0.3188437	total: 1.52s	remaining: 4.99s
234:	learn: 0.3187387	total: 1.53s	remaining: 4.99s
235:	learn: 0.3186157	total: 1.54s	remaining: 4.99s
236:	learn: 0.3185125	total: 1.55s	remaining: 4.99s
237:	learn: 0.3184267	total: 1.56s	remaining: 4.98s
238:	learn: 0.3183360	total: 1.56s	remaining: 4.98s
239:	learn: 0.3182384	total: 1.57s	remaining: 4.98s
240:	learn: 0.3181280	total: 1.58s	remaining: 4.97s
241:	learn: 0.3180313	total: 1.58s	remaining: 4.97s
242:	learn: 0.3179220	total: 1.59s	remaining: 4.96s
243:	learn: 0.3178223	total: 1.6s	remaining: 4.96s
244:	learn: 0.3177352	total: 1.61s	remaining: 4.96s
245:	learn: 0.3176311	total: 1.62s	remaining: 4.96s
246:	learn: 0.3175168	total: 1.62s	remaining: 4.95s
247:	learn: 0.3174054	total: 1.63s	remaining: 4.95s
248:	learn: 0.3173062	total: 1.64s	remaining: 4.95s
249:	learn: 0.3172235	total: 1.65s	remaining: 4.94s
250:	learn: 0.3171371	total: 1.66s	remaining: 4.94s
251:	learn: 0.3169135	total: 1.66s	remaining: 4.93s
252:	learn: 0.3168064	total: 1.67s	remaining: 4.93s
253:	learn: 0.3166422	total: 1.68s	remaining: 4.93s
254:	learn: 0.3165559	total: 1.69s	remaining: 4.92s
255:	learn: 0.3164176	total: 1.69s	remaining: 4.92s
256:	learn: 0.3163678	total: 1.7s	remaining: 4.92s
257:	learn: 0.3162848	total: 1.71s	remaining: 4.91s
258:	learn: 0.3161943	total: 1.72s	remaining: 4.91s
259:	learn: 0.3160685	total: 1.72s	remaining: 4.91s
260:	learn: 0.3159265	total: 1.73s	remaining: 4.9s
261:	learn: 0.3158614	total: 1.74s	remaining: 4.9s
262:	learn: 0.3157958	total: 1.75s	remaining: 4.9s
263:	learn: 0.3157160	total: 1.75s	remaining: 4.89s
264:	learn: 0.3156617	total: 1.76s	remaining: 4.89s
265:	learn: 0.3155741	total: 1.77s	remaining: 4.89s
266:	learn: 0.3154866	total: 1.78s	remaining: 4.89s
267:	learn: 0.3153685	total: 1.79s	remaining: 4.88s
268:	learn: 0.3152605	total: 1.8s	remaining: 4.88s
269:	learn: 0.3151912	total: 1.81s	remaining: 4.88s
270:	learn: 0.3150594	total: 1.81s	remaining: 4.88s
271:	learn: 0.3149812	total: 1.82s	remaining: 4.87s
272:	learn: 0.3148994	total: 1.83s	remaining: 4.86s
273:	learn: 0.3148175	total: 1.83s	remaining: 4.86s
274:	learn: 0.3147113	total: 1.84s	remaining: 4.85s
275:	learn: 0.3146340	total: 1.85s	remaining: 4.84s
276:	learn: 0.3145347	total: 1.85s	remaining: 4.83s
277:	learn: 0.3144011	total: 1.86s	remaining: 4.83s
278:	learn: 0.3143209	total: 1.86s	remaining: 4.82s
279:	learn: 0.3142279	total: 1.87s	remaining: 4.81s
280:	learn: 0.3141271	total: 1.88s	remaining: 4.81s
281:	learn: 0.3140596	total: 1.89s	remaining: 4.8s
282:	learn: 0.3139793	total: 1.89s	remaining: 4.79s
283:	learn: 0.3139072	total: 1.9s	remaining: 4.79s
284:	learn: 0.3138176	total: 1.9s	remaining: 4.78s
285:	learn: 0.3137064	total: 1.91s	remaining: 4.77s
286:	learn: 0.3136055	total: 1.92s	remaining: 4.76s
287:	learn: 0.3134983	total: 1.92s	remaining: 4.76s
288:	learn: 0.3134082	total: 1.93s	remaining: 4.75s
289:	learn: 0.3133389	total: 1.94s	remaining: 4.74s
290:	learn: 0.3132640	total: 1.94s	remaining: 4.74s
291:	learn: 0.3131843	total: 1.95s	remaining: 4.73s
292:	learn: 0.3130759	total: 1.96s	remaining: 4.72s
293:	learn: 0.3130059	total: 1.96s	remaining: 4.71s
294:	learn: 0.3128940	total: 1.97s	remaining: 4.71s
295:	learn: 0.3127934	total: 1.98s	remaining: 4.7s
296:	learn: 0.3127313	total: 1.98s	remaining: 4.69s
297:	learn: 0.3126504	total: 1.99s	remaining: 4.69s
298:	learn: 0.3125442	total: 2s	remaining: 4.68s
299:	learn: 0.3124493	total: 2s	remaining: 4.67s
300:	learn: 0.3123693	total: 2.01s	remaining: 4.66s
301:	learn: 0.3121828	total: 2.02s	remaining: 4.66s
302:	learn: 0.3120550	total: 2.02s	remaining: 4.65s
303:	learn: 0.3119955	total: 2.03s	remaining: 4.64s
304:	learn: 0.3119049	total: 2.03s	remaining: 4.63s
305:	learn: 0.3118277	total: 2.04s	remaining: 4.63s
306:	learn: 0.3117440	total: 2.05s	remaining: 4.62s
307:	learn: 0.3116536	total: 2.05s	remaining: 4.61s
308:	learn: 0.3115848	total: 2.06s	remaining: 4.61s
309:	learn: 0.3115012	total: 2.06s	remaining: 4.6s
310:	learn: 0.3113823	total: 2.07s	remaining: 4.59s
311:	learn: 0.3111763	total: 2.08s	remaining: 4.58s
312:	learn: 0.3110936	total: 2.08s	remaining: 4.58s
313:	learn: 0.3110286	total: 2.09s	remaining: 4.57s
314:	learn: 0.3109251	total: 2.1s	remaining: 4.56s
315:	learn: 0.3108434	total: 2.1s	remaining: 4.55s
316:	learn: 0.3107548	total: 2.11s	remaining: 4.54s
317:	learn: 0.3106742	total: 2.12s	remaining: 4.54s
318:	learn: 0.3106107	total: 2.12s	remaining: 4.53s
319:	learn: 0.3104713	total: 2.13s	remaining: 4.52s
320:	learn: 0.3103877	total: 2.13s	remaining: 4.52s
321:	learn: 0.3103061	total: 2.14s	remaining: 4.51s
322:	learn: 0.3102274	total: 2.15s	remaining: 4.5s
323:	learn: 0.3101523	total: 2.15s	remaining: 4.49s
324:	learn: 0.3100648	total: 2.16s	remaining: 4.49s
325:	learn: 0.3099816	total: 2.17s	remaining: 4.48s
326:	learn: 0.3099025	total: 2.17s	remaining: 4.47s
327:	learn: 0.3098266	total: 2.18s	remaining: 4.47s
328:	learn: 0.3097977	total: 2.19s	remaining: 4.46s
329:	learn: 0.3097153	total: 2.19s	remaining: 4.45s
330:	learn: 0.3096062	total: 2.2s	remaining: 4.45s
331:	learn: 0.3095421	total: 2.21s	remaining: 4.44s
332:	learn: 0.3094413	total: 2.21s	remaining: 4.43s
333:	learn: 0.3093862	total: 2.22s	remaining: 4.42s
334:	learn: 0.3093095	total: 2.23s	remaining: 4.42s
335:	learn: 0.3092558	total: 2.23s	remaining: 4.41s
336:	learn: 0.3091705	total: 2.24s	remaining: 4.4s
337:	learn: 0.3090876	total: 2.24s	remaining: 4.4s
338:	learn: 0.3089865	total: 2.25s	remaining: 4.39s
339:	learn: 0.3089205	total: 2.26s	remaining: 4.38s
340:	learn: 0.3088728	total: 2.27s	remaining: 4.38s
341:	learn: 0.3088203	total: 2.27s	remaining: 4.38s
342:	learn: 0.3087177	total: 2.28s	remaining: 4.37s
343:	learn: 0.3086875	total: 2.29s	remaining: 4.37s
344:	learn: 0.3086127	total: 2.3s	remaining: 4.36s
345:	learn: 0.3085315	total: 2.3s	remaining: 4.35s
346:	learn: 0.3084558	total: 2.31s	remaining: 4.35s
347:	learn: 0.3083827	total: 2.32s	remaining: 4.34s
348:	learn: 0.3082776	total: 2.32s	remaining: 4.33s
349:	learn: 0.3081777	total: 2.33s	remaining: 4.33s
350:	learn: 0.3080719	total: 2.34s	remaining: 4.33s
351:	learn: 0.3079924	total: 2.35s	remaining: 4.32s
352:	learn: 0.3079276	total: 2.35s	remaining: 4.31s
353:	learn: 0.3078412	total: 2.36s	remaining: 4.31s
354:	learn: 0.3077620	total: 2.37s	remaining: 4.31s
355:	learn: 0.3077142	total: 2.38s	remaining: 4.3s
356:	learn: 0.3076364	total: 2.39s	remaining: 4.3s
357:	learn: 0.3075800	total: 2.39s	remaining: 4.29s
358:	learn: 0.3074941	total: 2.4s	remaining: 4.29s
359:	learn: 0.3073846	total: 2.41s	remaining: 4.28s
360:	learn: 0.3073161	total: 2.41s	remaining: 4.27s
361:	learn: 0.3072841	total: 2.42s	remaining: 4.27s
362:	learn: 0.3071911	total: 2.43s	remaining: 4.26s
363:	learn: 0.3071149	total: 2.44s	remaining: 4.26s
364:	learn: 0.3070395	total: 2.44s	remaining: 4.25s
365:	learn: 0.3069593	total: 2.45s	remaining: 4.25s
366:	learn: 0.3068688	total: 2.46s	remaining: 4.24s
367:	learn: 0.3067718	total: 2.46s	remaining: 4.23s
368:	learn: 0.3066957	total: 2.47s	remaining: 4.23s
369:	learn: 0.3066251	total: 2.48s	remaining: 4.22s
370:	learn: 0.3065340	total: 2.49s	remaining: 4.21s
371:	learn: 0.3064930	total: 2.49s	remaining: 4.21s
372:	learn: 0.3064139	total: 2.5s	remaining: 4.2s
373:	learn: 0.3063298	total: 2.5s	remaining: 4.19s
374:	learn: 0.3062684	total: 2.51s	remaining: 4.19s
375:	learn: 0.3061654	total: 2.52s	remaining: 4.18s
376:	learn: 0.3060828	total: 2.52s	remaining: 4.17s
377:	learn: 0.3060152	total: 2.53s	remaining: 4.17s
378:	learn: 0.3059270	total: 2.54s	remaining: 4.16s
379:	learn: 0.3058515	total: 2.55s	remaining: 4.15s
380:	learn: 0.3057789	total: 2.55s	remaining: 4.15s
381:	learn: 0.3056934	total: 2.56s	remaining: 4.14s
382:	learn: 0.3056049	total: 2.57s	remaining: 4.14s
383:	learn: 0.3055517	total: 2.58s	remaining: 4.13s
384:	learn: 0.3054656	total: 2.58s	remaining: 4.13s
385:	learn: 0.3054150	total: 2.59s	remaining: 4.12s
386:	learn: 0.3053339	total: 2.6s	remaining: 4.12s
387:	learn: 0.3052744	total: 2.61s	remaining: 4.11s
388:	learn: 0.3052052	total: 2.61s	remaining: 4.11s
389:	learn: 0.3051331	total: 2.62s	remaining: 4.1s
390:	learn: 0.3050486	total: 2.63s	remaining: 4.09s
391:	learn: 0.3049326	total: 2.63s	remaining: 4.09s
392:	learn: 0.3048881	total: 2.64s	remaining: 4.08s
393:	learn: 0.3048336	total: 2.65s	remaining: 4.07s
394:	learn: 0.3047738	total: 2.66s	remaining: 4.07s
395:	learn: 0.3046984	total: 2.66s	remaining: 4.06s
396:	learn: 0.3046306	total: 2.67s	remaining: 4.05s
397:	learn: 0.3045657	total: 2.67s	remaining: 4.04s
398:	learn: 0.3045231	total: 2.68s	remaining: 4.04s
399:	learn: 0.3044414	total: 2.69s	remaining: 4.03s
400:	learn: 0.3043611	total: 2.69s	remaining: 4.02s
401:	learn: 0.3042876	total: 2.7s	remaining: 4.02s
402:	learn: 0.3042271	total: 2.71s	remaining: 4.01s
403:	learn: 0.3041447	total: 2.71s	remaining: 4.01s
404:	learn: 0.3040763	total: 2.72s	remaining: 4s
405:	learn: 0.3039900	total: 2.73s	remaining: 4s
406:	learn: 0.3039303	total: 2.74s	remaining: 3.99s
407:	learn: 0.3038573	total: 2.75s	remaining: 3.98s
408:	learn: 0.3037873	total: 2.75s	remaining: 3.98s
409:	learn: 0.3037372	total: 2.76s	remaining: 3.97s
410:	learn: 0.3036655	total: 2.77s	remaining: 3.96s
411:	learn: 0.3035925	total: 2.77s	remaining: 3.96s
412:	learn: 0.3035166	total: 2.78s	remaining: 3.95s
413:	learn: 0.3034549	total: 2.79s	remaining: 3.95s
414:	learn: 0.3033199	total: 2.8s	remaining: 3.94s
415:	learn: 0.3032932	total: 2.8s	remaining: 3.94s
416:	learn: 0.3032304	total: 2.81s	remaining: 3.93s
417:	learn: 0.3031627	total: 2.82s	remaining: 3.92s
418:	learn: 0.3030862	total: 2.82s	remaining: 3.92s
419:	learn: 0.3029717	total: 2.83s	remaining: 3.91s
420:	learn: 0.3028997	total: 2.84s	remaining: 3.9s
421:	learn: 0.3028084	total: 2.85s	remaining: 3.9s
422:	learn: 0.3027159	total: 2.85s	remaining: 3.89s
423:	learn: 0.3026661	total: 2.86s	remaining: 3.89s
424:	learn: 0.3026138	total: 2.87s	remaining: 3.88s
425:	learn: 0.3025372	total: 2.88s	remaining: 3.88s
426:	learn: 0.3024466	total: 2.88s	remaining: 3.87s
427:	learn: 0.3023576	total: 2.89s	remaining: 3.86s
428:	learn: 0.3022941	total: 2.9s	remaining: 3.86s
429:	learn: 0.3022257	total: 2.9s	remaining: 3.85s
430:	learn: 0.3021519	total: 2.91s	remaining: 3.84s
431:	learn: 0.3020562	total: 2.92s	remaining: 3.84s
432:	learn: 0.3019712	total: 2.92s	remaining: 3.83s
433:	learn: 0.3019047	total: 2.93s	remaining: 3.82s
434:	learn: 0.3018419	total: 2.93s	remaining: 3.81s
435:	learn: 0.3017732	total: 2.94s	remaining: 3.8s
436:	learn: 0.3016927	total: 2.94s	remaining: 3.79s
437:	learn: 0.3016271	total: 2.95s	remaining: 3.79s
438:	learn: 0.3015719	total: 2.96s	remaining: 3.78s
439:	learn: 0.3014948	total: 2.96s	remaining: 3.77s
440:	learn: 0.3013829	total: 2.97s	remaining: 3.77s
441:	learn: 0.3013242	total: 2.98s	remaining: 3.76s
442:	learn: 0.3012581	total: 2.98s	remaining: 3.75s
443:	learn: 0.3011860	total: 2.99s	remaining: 3.74s
444:	learn: 0.3011163	total: 2.99s	remaining: 3.73s
445:	learn: 0.3010641	total: 3s	remaining: 3.73s
446:	learn: 0.3009940	total: 3s	remaining: 3.72s
447:	learn: 0.3009031	total: 3.01s	remaining: 3.71s
448:	learn: 0.3008368	total: 3.02s	remaining: 3.7s
449:	learn: 0.3007945	total: 3.03s	remaining: 3.7s
450:	learn: 0.3006919	total: 3.03s	remaining: 3.69s
451:	learn: 0.3006357	total: 3.04s	remaining: 3.68s
452:	learn: 0.3005592	total: 3.04s	remaining: 3.68s
453:	learn: 0.3005001	total: 3.05s	remaining: 3.67s
454:	learn: 0.3004659	total: 3.06s	remaining: 3.66s
455:	learn: 0.3004066	total: 3.06s	remaining: 3.66s
456:	learn: 0.3003396	total: 3.07s	remaining: 3.65s
457:	learn: 0.3002882	total: 3.08s	remaining: 3.64s
458:	learn: 0.3002321	total: 3.08s	remaining: 3.63s
459:	learn: 0.3001657	total: 3.09s	remaining: 3.63s
460:	learn: 0.3001066	total: 3.1s	remaining: 3.62s
461:	learn: 0.3000604	total: 3.1s	remaining: 3.61s
462:	learn: 0.2999822	total: 3.11s	remaining: 3.61s
463:	learn: 0.2998927	total: 3.12s	remaining: 3.6s
464:	learn: 0.2998426	total: 3.12s	remaining: 3.59s
465:	learn: 0.2997573	total: 3.13s	remaining: 3.59s
466:	learn: 0.2997158	total: 3.14s	remaining: 3.58s
467:	learn: 0.2996437	total: 3.14s	remaining: 3.57s
468:	learn: 0.2996087	total: 3.15s	remaining: 3.57s
469:	learn: 0.2995227	total: 3.16s	remaining: 3.56s
470:	learn: 0.2994743	total: 3.16s	remaining: 3.55s
471:	learn: 0.2993830	total: 3.17s	remaining: 3.54s
472:	learn: 0.2993502	total: 3.17s	remaining: 3.54s
473:	learn: 0.2992664	total: 3.18s	remaining: 3.53s
474:	learn: 0.2991632	total: 3.19s	remaining: 3.52s
475:	learn: 0.2990985	total: 3.19s	remaining: 3.52s
476:	learn: 0.2990352	total: 3.2s	remaining: 3.51s
477:	learn: 0.2989686	total: 3.21s	remaining: 3.5s
478:	learn: 0.2989019	total: 3.21s	remaining: 3.5s
479:	learn: 0.2988361	total: 3.22s	remaining: 3.49s
480:	learn: 0.2987365	total: 3.23s	remaining: 3.48s
481:	learn: 0.2986945	total: 3.23s	remaining: 3.48s
482:	learn: 0.2986515	total: 3.24s	remaining: 3.47s
483:	learn: 0.2985720	total: 3.25s	remaining: 3.46s
484:	learn: 0.2985276	total: 3.25s	remaining: 3.45s
485:	learn: 0.2984553	total: 3.26s	remaining: 3.45s
486:	learn: 0.2983871	total: 3.27s	remaining: 3.44s
487:	learn: 0.2983050	total: 3.27s	remaining: 3.43s
488:	learn: 0.2982669	total: 3.28s	remaining: 3.43s
489:	learn: 0.2981804	total: 3.29s	remaining: 3.42s
490:	learn: 0.2981210	total: 3.29s	remaining: 3.41s
491:	learn: 0.2980363	total: 3.3s	remaining: 3.41s
492:	learn: 0.2979589	total: 3.31s	remaining: 3.4s
493:	learn: 0.2978975	total: 3.31s	remaining: 3.39s
494:	learn: 0.2978255	total: 3.32s	remaining: 3.39s
495:	learn: 0.2977685	total: 3.33s	remaining: 3.38s
496:	learn: 0.2976992	total: 3.33s	remaining: 3.37s
497:	learn: 0.2976271	total: 3.34s	remaining: 3.37s
498:	learn: 0.2975702	total: 3.35s	remaining: 3.36s
499:	learn: 0.2975249	total: 3.35s	remaining: 3.35s
500:	learn: 0.2974689	total: 3.36s	remaining: 3.35s
501:	learn: 0.2974052	total: 3.37s	remaining: 3.34s
502:	learn: 0.2973373	total: 3.37s	remaining: 3.33s
503:	learn: 0.2972700	total: 3.38s	remaining: 3.33s
504:	learn: 0.2971966	total: 3.38s	remaining: 3.32s
505:	learn: 0.2971277	total: 3.39s	remaining: 3.31s
506:	learn: 0.2970749	total: 3.4s	remaining: 3.3s
507:	learn: 0.2970068	total: 3.4s	remaining: 3.3s
508:	learn: 0.2969323	total: 3.41s	remaining: 3.29s
509:	learn: 0.2968682	total: 3.42s	remaining: 3.28s
510:	learn: 0.2967770	total: 3.42s	remaining: 3.28s
511:	learn: 0.2967271	total: 3.43s	remaining: 3.27s
512:	learn: 0.2966499	total: 3.44s	remaining: 3.26s
513:	learn: 0.2965624	total: 3.44s	remaining: 3.26s
514:	learn: 0.2965095	total: 3.45s	remaining: 3.25s
515:	learn: 0.2964473	total: 3.46s	remaining: 3.24s
516:	learn: 0.2963716	total: 3.46s	remaining: 3.24s
517:	learn: 0.2963216	total: 3.47s	remaining: 3.23s
518:	learn: 0.2962648	total: 3.48s	remaining: 3.22s
519:	learn: 0.2962077	total: 3.48s	remaining: 3.22s
520:	learn: 0.2961438	total: 3.49s	remaining: 3.21s
521:	learn: 0.2960802	total: 3.5s	remaining: 3.2s
522:	learn: 0.2960066	total: 3.5s	remaining: 3.2s
523:	learn: 0.2959405	total: 3.51s	remaining: 3.19s
524:	learn: 0.2958669	total: 3.52s	remaining: 3.18s
525:	learn: 0.2957958	total: 3.52s	remaining: 3.17s
526:	learn: 0.2957243	total: 3.53s	remaining: 3.17s
527:	learn: 0.2956584	total: 3.54s	remaining: 3.16s
528:	learn: 0.2955914	total: 3.54s	remaining: 3.15s
529:	learn: 0.2955075	total: 3.55s	remaining: 3.15s
530:	learn: 0.2954444	total: 3.56s	remaining: 3.14s
531:	learn: 0.2953572	total: 3.56s	remaining: 3.13s
532:	learn: 0.2953179	total: 3.57s	remaining: 3.13s
533:	learn: 0.2952447	total: 3.58s	remaining: 3.12s
534:	learn: 0.2951708	total: 3.58s	remaining: 3.11s
535:	learn: 0.2951062	total: 3.59s	remaining: 3.11s
536:	learn: 0.2950548	total: 3.59s	remaining: 3.1s
537:	learn: 0.2950023	total: 3.6s	remaining: 3.09s
538:	learn: 0.2949455	total: 3.61s	remaining: 3.08s
539:	learn: 0.2948830	total: 3.61s	remaining: 3.08s
540:	learn: 0.2948168	total: 3.62s	remaining: 3.07s
541:	learn: 0.2947539	total: 3.63s	remaining: 3.06s
542:	learn: 0.2946940	total: 3.63s	remaining: 3.06s
543:	learn: 0.2946451	total: 3.64s	remaining: 3.05s
544:	learn: 0.2945788	total: 3.65s	remaining: 3.04s
545:	learn: 0.2945398	total: 3.65s	remaining: 3.04s
546:	learn: 0.2944699	total: 3.66s	remaining: 3.03s
547:	learn: 0.2944149	total: 3.67s	remaining: 3.02s
548:	learn: 0.2943781	total: 3.67s	remaining: 3.02s
549:	learn: 0.2943072	total: 3.68s	remaining: 3.01s
550:	learn: 0.2942415	total: 3.69s	remaining: 3s
551:	learn: 0.2941522	total: 3.69s	remaining: 3s
552:	learn: 0.2940738	total: 3.7s	remaining: 2.99s
553:	learn: 0.2940151	total: 3.71s	remaining: 2.98s
554:	learn: 0.2939476	total: 3.71s	remaining: 2.98s
555:	learn: 0.2938753	total: 3.72s	remaining: 2.97s
556:	learn: 0.2938525	total: 3.72s	remaining: 2.96s
557:	learn: 0.2937887	total: 3.73s	remaining: 2.95s
558:	learn: 0.2937160	total: 3.73s	remaining: 2.94s
559:	learn: 0.2936548	total: 3.74s	remaining: 2.94s
560:	learn: 0.2936042	total: 3.75s	remaining: 2.93s
561:	learn: 0.2935376	total: 3.75s	remaining: 2.92s
562:	learn: 0.2934800	total: 3.76s	remaining: 2.92s
563:	learn: 0.2934420	total: 3.76s	remaining: 2.91s
564:	learn: 0.2933792	total: 3.77s	remaining: 2.9s
565:	learn: 0.2933357	total: 3.78s	remaining: 2.9s
566:	learn: 0.2932920	total: 3.78s	remaining: 2.89s
567:	learn: 0.2932286	total: 3.79s	remaining: 2.88s
568:	learn: 0.2931489	total: 3.8s	remaining: 2.88s
569:	learn: 0.2930885	total: 3.8s	remaining: 2.87s
570:	learn: 0.2930498	total: 3.81s	remaining: 2.86s
571:	learn: 0.2929897	total: 3.81s	remaining: 2.85s
572:	learn: 0.2929460	total: 3.82s	remaining: 2.85s
573:	learn: 0.2928786	total: 3.83s	remaining: 2.84s
574:	learn: 0.2928015	total: 3.83s	remaining: 2.83s
575:	learn: 0.2927385	total: 3.84s	remaining: 2.83s
576:	learn: 0.2926703	total: 3.85s	remaining: 2.82s
577:	learn: 0.2926045	total: 3.85s	remaining: 2.81s
578:	learn: 0.2925555	total: 3.86s	remaining: 2.8s
579:	learn: 0.2924944	total: 3.86s	remaining: 2.8s
580:	learn: 0.2924384	total: 3.87s	remaining: 2.79s
581:	learn: 0.2923769	total: 3.87s	remaining: 2.78s
582:	learn: 0.2923253	total: 3.88s	remaining: 2.77s
583:	learn: 0.2922803	total: 3.89s	remaining: 2.77s
584:	learn: 0.2921839	total: 3.89s	remaining: 2.76s
585:	learn: 0.2921311	total: 3.9s	remaining: 2.75s
586:	learn: 0.2920814	total: 3.9s	remaining: 2.75s
587:	learn: 0.2920115	total: 3.91s	remaining: 2.74s
588:	learn: 0.2919678	total: 3.91s	remaining: 2.73s
589:	learn: 0.2918823	total: 3.92s	remaining: 2.72s
590:	learn: 0.2918259	total: 3.92s	remaining: 2.71s
591:	learn: 0.2917662	total: 3.93s	remaining: 2.71s
592:	learn: 0.2916868	total: 3.94s	remaining: 2.7s
593:	learn: 0.2916352	total: 3.94s	remaining: 2.69s
594:	learn: 0.2916053	total: 3.95s	remaining: 2.69s
595:	learn: 0.2915393	total: 3.95s	remaining: 2.68s
596:	learn: 0.2914709	total: 3.96s	remaining: 2.67s
597:	learn: 0.2914253	total: 3.97s	remaining: 2.67s
598:	learn: 0.2913663	total: 3.97s	remaining: 2.66s
599:	learn: 0.2912906	total: 3.98s	remaining: 2.65s
600:	learn: 0.2912317	total: 3.98s	remaining: 2.64s
601:	learn: 0.2911680	total: 3.99s	remaining: 2.64s
602:	learn: 0.2910925	total: 3.99s	remaining: 2.63s
603:	learn: 0.2910643	total: 4s	remaining: 2.62s
604:	learn: 0.2910047	total: 4.01s	remaining: 2.62s
605:	learn: 0.2909585	total: 4.01s	remaining: 2.61s
606:	learn: 0.2909044	total: 4.02s	remaining: 2.6s
607:	learn: 0.2908639	total: 4.02s	remaining: 2.59s
608:	learn: 0.2908121	total: 4.03s	remaining: 2.59s
609:	learn: 0.2907644	total: 4.03s	remaining: 2.58s
610:	learn: 0.2907071	total: 4.04s	remaining: 2.57s
611:	learn: 0.2906477	total: 4.04s	remaining: 2.56s
612:	learn: 0.2905880	total: 4.05s	remaining: 2.56s
613:	learn: 0.2905484	total: 4.05s	remaining: 2.55s
614:	learn: 0.2904793	total: 4.06s	remaining: 2.54s
615:	learn: 0.2904270	total: 4.07s	remaining: 2.54s
616:	learn: 0.2903873	total: 4.07s	remaining: 2.53s
617:	learn: 0.2903094	total: 4.08s	remaining: 2.52s
618:	learn: 0.2902467	total: 4.08s	remaining: 2.51s
619:	learn: 0.2901829	total: 4.09s	remaining: 2.5s
620:	learn: 0.2901043	total: 4.09s	remaining: 2.5s
621:	learn: 0.2900336	total: 4.1s	remaining: 2.49s
622:	learn: 0.2899615	total: 4.1s	remaining: 2.48s
623:	learn: 0.2899187	total: 4.11s	remaining: 2.48s
624:	learn: 0.2898803	total: 4.11s	remaining: 2.47s
625:	learn: 0.2898195	total: 4.12s	remaining: 2.46s
626:	learn: 0.2897932	total: 4.13s	remaining: 2.45s
627:	learn: 0.2897137	total: 4.13s	remaining: 2.45s
628:	learn: 0.2896570	total: 4.14s	remaining: 2.44s
629:	learn: 0.2895952	total: 4.14s	remaining: 2.43s
630:	learn: 0.2895399	total: 4.15s	remaining: 2.42s
631:	learn: 0.2894711	total: 4.15s	remaining: 2.42s
632:	learn: 0.2894455	total: 4.16s	remaining: 2.41s
633:	learn: 0.2893685	total: 4.16s	remaining: 2.4s
634:	learn: 0.2892991	total: 4.17s	remaining: 2.4s
635:	learn: 0.2892322	total: 4.17s	remaining: 2.39s
636:	learn: 0.2891851	total: 4.18s	remaining: 2.38s
637:	learn: 0.2891333	total: 4.18s	remaining: 2.38s
638:	learn: 0.2890517	total: 4.19s	remaining: 2.37s
639:	learn: 0.2889806	total: 4.2s	remaining: 2.36s
640:	learn: 0.2889203	total: 4.2s	remaining: 2.35s
641:	learn: 0.2888542	total: 4.21s	remaining: 2.35s
642:	learn: 0.2888009	total: 4.21s	remaining: 2.34s
643:	learn: 0.2887589	total: 4.22s	remaining: 2.33s
644:	learn: 0.2886962	total: 4.22s	remaining: 2.32s
645:	learn: 0.2886524	total: 4.23s	remaining: 2.32s
646:	learn: 0.2885690	total: 4.23s	remaining: 2.31s
647:	learn: 0.2885346	total: 4.24s	remaining: 2.3s
648:	learn: 0.2884784	total: 4.24s	remaining: 2.29s
649:	learn: 0.2884280	total: 4.25s	remaining: 2.29s
650:	learn: 0.2883820	total: 4.25s	remaining: 2.28s
651:	learn: 0.2883184	total: 4.26s	remaining: 2.27s
652:	learn: 0.2882609	total: 4.26s	remaining: 2.27s
653:	learn: 0.2881956	total: 4.27s	remaining: 2.26s
654:	learn: 0.2881377	total: 4.28s	remaining: 2.25s
655:	learn: 0.2880665	total: 4.28s	remaining: 2.24s
656:	learn: 0.2880283	total: 4.29s	remaining: 2.24s
657:	learn: 0.2879867	total: 4.29s	remaining: 2.23s
658:	learn: 0.2879269	total: 4.29s	remaining: 2.22s
659:	learn: 0.2878907	total: 4.3s	remaining: 2.21s
660:	learn: 0.2878130	total: 4.31s	remaining: 2.21s
661:	learn: 0.2877691	total: 4.31s	remaining: 2.2s
662:	learn: 0.2877338	total: 4.32s	remaining: 2.19s
663:	learn: 0.2876540	total: 4.32s	remaining: 2.19s
664:	learn: 0.2876219	total: 4.33s	remaining: 2.18s
665:	learn: 0.2875589	total: 4.33s	remaining: 2.17s
666:	learn: 0.2875039	total: 4.34s	remaining: 2.17s
667:	learn: 0.2874514	total: 4.34s	remaining: 2.16s
668:	learn: 0.2873922	total: 4.35s	remaining: 2.15s
669:	learn: 0.2873354	total: 4.36s	remaining: 2.15s
670:	learn: 0.2872853	total: 4.36s	remaining: 2.14s
671:	learn: 0.2872349	total: 4.37s	remaining: 2.13s
672:	learn: 0.2871911	total: 4.37s	remaining: 2.12s
673:	learn: 0.2871310	total: 4.38s	remaining: 2.12s
674:	learn: 0.2870970	total: 4.38s	remaining: 2.11s
675:	learn: 0.2870615	total: 4.39s	remaining: 2.1s
676:	learn: 0.2870172	total: 4.4s	remaining: 2.1s
677:	learn: 0.2869476	total: 4.4s	remaining: 2.09s
678:	learn: 0.2868897	total: 4.41s	remaining: 2.08s
679:	learn: 0.2868302	total: 4.42s	remaining: 2.08s
680:	learn: 0.2867671	total: 4.42s	remaining: 2.07s
681:	learn: 0.2867086	total: 4.43s	remaining: 2.06s
682:	learn: 0.2866710	total: 4.43s	remaining: 2.06s
683:	learn: 0.2866293	total: 4.44s	remaining: 2.05s
684:	learn: 0.2865485	total: 4.45s	remaining: 2.04s
685:	learn: 0.2864714	total: 4.45s	remaining: 2.04s
686:	learn: 0.2864023	total: 4.46s	remaining: 2.03s
687:	learn: 0.2863491	total: 4.47s	remaining: 2.02s
688:	learn: 0.2862775	total: 4.47s	remaining: 2.02s
689:	learn: 0.2862181	total: 4.48s	remaining: 2.01s
690:	learn: 0.2861400	total: 4.49s	remaining: 2.01s
691:	learn: 0.2860783	total: 4.49s	remaining: 2s
692:	learn: 0.2860037	total: 4.5s	remaining: 1.99s
693:	learn: 0.2859663	total: 4.5s	remaining: 1.99s
694:	learn: 0.2859045	total: 4.51s	remaining: 1.98s
695:	learn: 0.2858500	total: 4.52s	remaining: 1.97s
696:	learn: 0.2858019	total: 4.52s	remaining: 1.97s
697:	learn: 0.2857538	total: 4.53s	remaining: 1.96s
698:	learn: 0.2856967	total: 4.54s	remaining: 1.95s
699:	learn: 0.2856509	total: 4.54s	remaining: 1.95s
700:	learn: 0.2855792	total: 4.55s	remaining: 1.94s
701:	learn: 0.2855321	total: 4.55s	remaining: 1.93s
702:	learn: 0.2854739	total: 4.56s	remaining: 1.93s
703:	learn: 0.2854226	total: 4.57s	remaining: 1.92s
704:	learn: 0.2853877	total: 4.58s	remaining: 1.91s
705:	learn: 0.2853094	total: 4.58s	remaining: 1.91s
706:	learn: 0.2852829	total: 4.59s	remaining: 1.9s
707:	learn: 0.2852059	total: 4.59s	remaining: 1.89s
708:	learn: 0.2851367	total: 4.6s	remaining: 1.89s
709:	learn: 0.2850881	total: 4.6s	remaining: 1.88s
710:	learn: 0.2850495	total: 4.61s	remaining: 1.87s
711:	learn: 0.2849758	total: 4.61s	remaining: 1.87s
712:	learn: 0.2849404	total: 4.62s	remaining: 1.86s
713:	learn: 0.2848727	total: 4.63s	remaining: 1.85s
714:	learn: 0.2848324	total: 4.63s	remaining: 1.84s
715:	learn: 0.2847763	total: 4.64s	remaining: 1.84s
716:	learn: 0.2847388	total: 4.64s	remaining: 1.83s
717:	learn: 0.2846783	total: 4.65s	remaining: 1.82s
718:	learn: 0.2846164	total: 4.65s	remaining: 1.82s
719:	learn: 0.2845971	total: 4.66s	remaining: 1.81s
720:	learn: 0.2845522	total: 4.66s	remaining: 1.8s
721:	learn: 0.2845038	total: 4.67s	remaining: 1.8s
722:	learn: 0.2844328	total: 4.67s	remaining: 1.79s
723:	learn: 0.2843865	total: 4.68s	remaining: 1.78s
724:	learn: 0.2843357	total: 4.68s	remaining: 1.78s
725:	learn: 0.2842929	total: 4.69s	remaining: 1.77s
726:	learn: 0.2842468	total: 4.7s	remaining: 1.76s
727:	learn: 0.2842050	total: 4.7s	remaining: 1.76s
728:	learn: 0.2841615	total: 4.71s	remaining: 1.75s
729:	learn: 0.2841023	total: 4.72s	remaining: 1.74s
730:	learn: 0.2840415	total: 4.72s	remaining: 1.74s
731:	learn: 0.2839823	total: 4.73s	remaining: 1.73s
732:	learn: 0.2839363	total: 4.74s	remaining: 1.73s
733:	learn: 0.2839080	total: 4.74s	remaining: 1.72s
734:	learn: 0.2838606	total: 4.75s	remaining: 1.71s
735:	learn: 0.2838012	total: 4.76s	remaining: 1.71s
736:	learn: 0.2837440	total: 4.76s	remaining: 1.7s
737:	learn: 0.2836950	total: 4.77s	remaining: 1.69s
738:	learn: 0.2836804	total: 4.78s	remaining: 1.69s
739:	learn: 0.2836259	total: 4.78s	remaining: 1.68s
740:	learn: 0.2835712	total: 4.79s	remaining: 1.67s
741:	learn: 0.2835323	total: 4.8s	remaining: 1.67s
742:	learn: 0.2834652	total: 4.8s	remaining: 1.66s
743:	learn: 0.2834356	total: 4.81s	remaining: 1.65s
744:	learn: 0.2833888	total: 4.82s	remaining: 1.65s
745:	learn: 0.2833488	total: 4.82s	remaining: 1.64s
746:	learn: 0.2832882	total: 4.83s	remaining: 1.64s
747:	learn: 0.2832456	total: 4.83s	remaining: 1.63s
748:	learn: 0.2831941	total: 4.84s	remaining: 1.62s
749:	learn: 0.2831254	total: 4.85s	remaining: 1.62s
750:	learn: 0.2830640	total: 4.85s	remaining: 1.61s
751:	learn: 0.2830067	total: 4.86s	remaining: 1.6s
752:	learn: 0.2829589	total: 4.87s	remaining: 1.6s
753:	learn: 0.2829040	total: 4.87s	remaining: 1.59s
754:	learn: 0.2828476	total: 4.88s	remaining: 1.58s
755:	learn: 0.2827706	total: 4.89s	remaining: 1.58s
756:	learn: 0.2827149	total: 4.89s	remaining: 1.57s
757:	learn: 0.2826553	total: 4.9s	remaining: 1.56s
758:	learn: 0.2826099	total: 4.91s	remaining: 1.56s
759:	learn: 0.2825358	total: 4.91s	remaining: 1.55s
760:	learn: 0.2824898	total: 4.92s	remaining: 1.54s
761:	learn: 0.2824414	total: 4.92s	remaining: 1.54s
762:	learn: 0.2824118	total: 4.93s	remaining: 1.53s
763:	learn: 0.2823562	total: 4.93s	remaining: 1.52s
764:	learn: 0.2823105	total: 4.94s	remaining: 1.52s
765:	learn: 0.2822572	total: 4.94s	remaining: 1.51s
766:	learn: 0.2821934	total: 4.95s	remaining: 1.5s
767:	learn: 0.2821413	total: 4.96s	remaining: 1.5s
768:	learn: 0.2820946	total: 4.96s	remaining: 1.49s
769:	learn: 0.2820280	total: 4.97s	remaining: 1.48s
770:	learn: 0.2820002	total: 4.97s	remaining: 1.48s
771:	learn: 0.2819410	total: 4.98s	remaining: 1.47s
772:	learn: 0.2818920	total: 4.98s	remaining: 1.46s
773:	learn: 0.2818421	total: 4.99s	remaining: 1.46s
774:	learn: 0.2818046	total: 4.99s	remaining: 1.45s
775:	learn: 0.2817531	total: 5s	remaining: 1.44s
776:	learn: 0.2817024	total: 5s	remaining: 1.44s
777:	learn: 0.2816546	total: 5.01s	remaining: 1.43s
778:	learn: 0.2816215	total: 5.01s	remaining: 1.42s
779:	learn: 0.2815803	total: 5.02s	remaining: 1.42s
780:	learn: 0.2815502	total: 5.03s	remaining: 1.41s
781:	learn: 0.2814987	total: 5.03s	remaining: 1.4s
782:	learn: 0.2814729	total: 5.04s	remaining: 1.4s
783:	learn: 0.2814296	total: 5.04s	remaining: 1.39s
784:	learn: 0.2813731	total: 5.04s	remaining: 1.38s
785:	learn: 0.2813213	total: 5.05s	remaining: 1.38s
786:	learn: 0.2812796	total: 5.06s	remaining: 1.37s
787:	learn: 0.2812416	total: 5.06s	remaining: 1.36s
788:	learn: 0.2812075	total: 5.07s	remaining: 1.35s
789:	learn: 0.2811684	total: 5.07s	remaining: 1.35s
790:	learn: 0.2811218	total: 5.08s	remaining: 1.34s
791:	learn: 0.2810697	total: 5.08s	remaining: 1.33s
792:	learn: 0.2810214	total: 5.09s	remaining: 1.33s
793:	learn: 0.2809578	total: 5.09s	remaining: 1.32s
794:	learn: 0.2809034	total: 5.1s	remaining: 1.31s
795:	learn: 0.2808564	total: 5.1s	remaining: 1.31s
796:	learn: 0.2808141	total: 5.11s	remaining: 1.3s
797:	learn: 0.2807628	total: 5.12s	remaining: 1.29s
798:	learn: 0.2807134	total: 5.12s	remaining: 1.29s
799:	learn: 0.2806448	total: 5.13s	remaining: 1.28s
800:	learn: 0.2806258	total: 5.13s	remaining: 1.27s
801:	learn: 0.2805733	total: 5.14s	remaining: 1.27s
802:	learn: 0.2805379	total: 5.14s	remaining: 1.26s
803:	learn: 0.2804951	total: 5.15s	remaining: 1.25s
804:	learn: 0.2804287	total: 5.15s	remaining: 1.25s
805:	learn: 0.2803619	total: 5.16s	remaining: 1.24s
806:	learn: 0.2803103	total: 5.16s	remaining: 1.23s
807:	learn: 0.2802664	total: 5.17s	remaining: 1.23s
808:	learn: 0.2802004	total: 5.17s	remaining: 1.22s
809:	learn: 0.2801528	total: 5.18s	remaining: 1.21s
810:	learn: 0.2801046	total: 5.18s	remaining: 1.21s
811:	learn: 0.2800448	total: 5.19s	remaining: 1.2s
812:	learn: 0.2799874	total: 5.19s	remaining: 1.19s
813:	learn: 0.2799375	total: 5.2s	remaining: 1.19s
814:	learn: 0.2798962	total: 5.21s	remaining: 1.18s
815:	learn: 0.2798257	total: 5.21s	remaining: 1.18s
816:	learn: 0.2797825	total: 5.22s	remaining: 1.17s
817:	learn: 0.2797303	total: 5.22s	remaining: 1.16s
818:	learn: 0.2796782	total: 5.23s	remaining: 1.16s
819:	learn: 0.2796212	total: 5.24s	remaining: 1.15s
820:	learn: 0.2795770	total: 5.24s	remaining: 1.14s
821:	learn: 0.2795303	total: 5.25s	remaining: 1.14s
822:	learn: 0.2794942	total: 5.25s	remaining: 1.13s
823:	learn: 0.2794281	total: 5.26s	remaining: 1.12s
824:	learn: 0.2793949	total: 5.26s	remaining: 1.12s
825:	learn: 0.2793604	total: 5.27s	remaining: 1.11s
826:	learn: 0.2793185	total: 5.27s	remaining: 1.1s
827:	learn: 0.2792556	total: 5.28s	remaining: 1.1s
828:	learn: 0.2792163	total: 5.28s	remaining: 1.09s
829:	learn: 0.2791728	total: 5.29s	remaining: 1.08s
830:	learn: 0.2791367	total: 5.29s	remaining: 1.08s
831:	learn: 0.2790922	total: 5.3s	remaining: 1.07s
832:	learn: 0.2790548	total: 5.3s	remaining: 1.06s
833:	learn: 0.2790061	total: 5.31s	remaining: 1.06s
834:	learn: 0.2789555	total: 5.31s	remaining: 1.05s
835:	learn: 0.2788837	total: 5.32s	remaining: 1.04s
836:	learn: 0.2788577	total: 5.32s	remaining: 1.04s
837:	learn: 0.2788117	total: 5.33s	remaining: 1.03s
838:	learn: 0.2787740	total: 5.33s	remaining: 1.02s
839:	learn: 0.2787317	total: 5.34s	remaining: 1.02s
840:	learn: 0.2786883	total: 5.35s	remaining: 1.01s
841:	learn: 0.2786326	total: 5.35s	remaining: 1s
842:	learn: 0.2785952	total: 5.36s	remaining: 998ms
843:	learn: 0.2785286	total: 5.36s	remaining: 991ms
844:	learn: 0.2784978	total: 5.37s	remaining: 985ms
845:	learn: 0.2784343	total: 5.37s	remaining: 978ms
846:	learn: 0.2783737	total: 5.38s	remaining: 971ms
847:	learn: 0.2783377	total: 5.38s	remaining: 965ms
848:	learn: 0.2782826	total: 5.39s	remaining: 958ms
849:	learn: 0.2782436	total: 5.39s	remaining: 952ms
850:	learn: 0.2782094	total: 5.4s	remaining: 945ms
851:	learn: 0.2781513	total: 5.4s	remaining: 939ms
852:	learn: 0.2781185	total: 5.41s	remaining: 932ms
853:	learn: 0.2780741	total: 5.41s	remaining: 926ms
854:	learn: 0.2780218	total: 5.42s	remaining: 919ms
855:	learn: 0.2779781	total: 5.43s	remaining: 913ms
856:	learn: 0.2779234	total: 5.43s	remaining: 907ms
857:	learn: 0.2778739	total: 5.44s	remaining: 900ms
858:	learn: 0.2777982	total: 5.44s	remaining: 894ms
859:	learn: 0.2777419	total: 5.45s	remaining: 887ms
860:	learn: 0.2777071	total: 5.45s	remaining: 881ms
861:	learn: 0.2776554	total: 5.46s	remaining: 874ms
862:	learn: 0.2776018	total: 5.46s	remaining: 868ms
863:	learn: 0.2775729	total: 5.47s	remaining: 861ms
864:	learn: 0.2775189	total: 5.48s	remaining: 855ms
865:	learn: 0.2774648	total: 5.48s	remaining: 848ms
866:	learn: 0.2773940	total: 5.49s	remaining: 842ms
867:	learn: 0.2773155	total: 5.49s	remaining: 836ms
868:	learn: 0.2772593	total: 5.5s	remaining: 829ms
869:	learn: 0.2772272	total: 5.51s	remaining: 823ms
870:	learn: 0.2771947	total: 5.51s	remaining: 817ms
871:	learn: 0.2771512	total: 5.52s	remaining: 810ms
872:	learn: 0.2770997	total: 5.53s	remaining: 804ms
873:	learn: 0.2770486	total: 5.53s	remaining: 798ms
874:	learn: 0.2770170	total: 5.54s	remaining: 792ms
875:	learn: 0.2769652	total: 5.55s	remaining: 786ms
876:	learn: 0.2769254	total: 5.56s	remaining: 780ms
877:	learn: 0.2768717	total: 5.57s	remaining: 774ms
878:	learn: 0.2768012	total: 5.58s	remaining: 768ms
879:	learn: 0.2767656	total: 5.59s	remaining: 762ms
880:	learn: 0.2767270	total: 5.59s	remaining: 756ms
881:	learn: 0.2766735	total: 5.6s	remaining: 749ms
882:	learn: 0.2766224	total: 5.61s	remaining: 743ms
883:	learn: 0.2765945	total: 5.62s	remaining: 737ms
884:	learn: 0.2765633	total: 5.62s	remaining: 731ms
885:	learn: 0.2765125	total: 5.63s	remaining: 725ms
886:	learn: 0.2764532	total: 5.64s	remaining: 718ms
887:	learn: 0.2764030	total: 5.64s	remaining: 712ms
888:	learn: 0.2763596	total: 5.65s	remaining: 706ms
889:	learn: 0.2763176	total: 5.66s	remaining: 700ms
890:	learn: 0.2762822	total: 5.67s	remaining: 693ms
891:	learn: 0.2762357	total: 5.67s	remaining: 687ms
892:	learn: 0.2761920	total: 5.68s	remaining: 681ms
893:	learn: 0.2761350	total: 5.69s	remaining: 675ms
894:	learn: 0.2760806	total: 5.7s	remaining: 668ms
895:	learn: 0.2759995	total: 5.71s	remaining: 662ms
896:	learn: 0.2759422	total: 5.71s	remaining: 656ms
897:	learn: 0.2758692	total: 5.72s	remaining: 650ms
898:	learn: 0.2758332	total: 5.73s	remaining: 644ms
899:	learn: 0.2757857	total: 5.74s	remaining: 637ms
900:	learn: 0.2757102	total: 5.74s	remaining: 631ms
901:	learn: 0.2756682	total: 5.75s	remaining: 625ms
902:	learn: 0.2756321	total: 5.76s	remaining: 619ms
903:	learn: 0.2755710	total: 5.77s	remaining: 612ms
904:	learn: 0.2755295	total: 5.77s	remaining: 606ms
905:	learn: 0.2754837	total: 5.78s	remaining: 600ms
906:	learn: 0.2754353	total: 5.79s	remaining: 594ms
907:	learn: 0.2753787	total: 5.8s	remaining: 587ms
908:	learn: 0.2753371	total: 5.81s	remaining: 582ms
909:	learn: 0.2752960	total: 5.83s	remaining: 577ms
910:	learn: 0.2752508	total: 5.84s	remaining: 571ms
911:	learn: 0.2751954	total: 5.85s	remaining: 565ms
912:	learn: 0.2751415	total: 5.86s	remaining: 558ms
913:	learn: 0.2750919	total: 5.87s	remaining: 552ms
914:	learn: 0.2750428	total: 5.88s	remaining: 546ms
915:	learn: 0.2750001	total: 5.88s	remaining: 540ms
916:	learn: 0.2749704	total: 5.89s	remaining: 533ms
917:	learn: 0.2749351	total: 5.9s	remaining: 527ms
918:	learn: 0.2748929	total: 5.91s	remaining: 521ms
919:	learn: 0.2748520	total: 5.91s	remaining: 514ms
920:	learn: 0.2748167	total: 5.92s	remaining: 508ms
921:	learn: 0.2747942	total: 5.93s	remaining: 502ms
922:	learn: 0.2747483	total: 5.94s	remaining: 495ms
923:	learn: 0.2747094	total: 5.94s	remaining: 489ms
924:	learn: 0.2746758	total: 5.95s	remaining: 482ms
925:	learn: 0.2746035	total: 5.96s	remaining: 476ms
926:	learn: 0.2745337	total: 5.97s	remaining: 470ms
927:	learn: 0.2745100	total: 5.98s	remaining: 464ms
928:	learn: 0.2744705	total: 5.99s	remaining: 458ms
929:	learn: 0.2744308	total: 6s	remaining: 451ms
930:	learn: 0.2743592	total: 6s	remaining: 445ms
931:	learn: 0.2743011	total: 6.01s	remaining: 439ms
932:	learn: 0.2742548	total: 6.02s	remaining: 432ms
933:	learn: 0.2741874	total: 6.02s	remaining: 426ms
934:	learn: 0.2741408	total: 6.03s	remaining: 419ms
935:	learn: 0.2740893	total: 6.04s	remaining: 413ms
936:	learn: 0.2740370	total: 6.04s	remaining: 406ms
937:	learn: 0.2740046	total: 6.05s	remaining: 400ms
938:	learn: 0.2739697	total: 6.06s	remaining: 394ms
939:	learn: 0.2739175	total: 6.07s	remaining: 387ms
940:	learn: 0.2738731	total: 6.07s	remaining: 381ms
941:	learn: 0.2738275	total: 6.08s	remaining: 374ms
942:	learn: 0.2737791	total: 6.09s	remaining: 368ms
943:	learn: 0.2737385	total: 6.09s	remaining: 362ms
944:	learn: 0.2736965	total: 6.1s	remaining: 355ms
945:	learn: 0.2736740	total: 6.11s	remaining: 349ms
946:	learn: 0.2736283	total: 6.12s	remaining: 342ms
947:	learn: 0.2735910	total: 6.12s	remaining: 336ms
948:	learn: 0.2735303	total: 6.13s	remaining: 329ms
949:	learn: 0.2734890	total: 6.14s	remaining: 323ms
950:	learn: 0.2734553	total: 6.14s	remaining: 317ms
951:	learn: 0.2734020	total: 6.15s	remaining: 310ms
952:	learn: 0.2733621	total: 6.16s	remaining: 304ms
953:	learn: 0.2733055	total: 6.17s	remaining: 297ms
954:	learn: 0.2732576	total: 6.17s	remaining: 291ms
955:	learn: 0.2732132	total: 6.18s	remaining: 284ms
956:	learn: 0.2731748	total: 6.19s	remaining: 278ms
957:	learn: 0.2731151	total: 6.19s	remaining: 272ms
958:	learn: 0.2730785	total: 6.2s	remaining: 265ms
959:	learn: 0.2730430	total: 6.21s	remaining: 259ms
960:	learn: 0.2730003	total: 6.21s	remaining: 252ms
961:	learn: 0.2729526	total: 6.22s	remaining: 246ms
962:	learn: 0.2729195	total: 6.23s	remaining: 239ms
963:	learn: 0.2728805	total: 6.24s	remaining: 233ms
964:	learn: 0.2728230	total: 6.24s	remaining: 226ms
965:	learn: 0.2727781	total: 6.25s	remaining: 220ms
966:	learn: 0.2727413	total: 6.26s	remaining: 214ms
967:	learn: 0.2726951	total: 6.26s	remaining: 207ms
968:	learn: 0.2726288	total: 6.27s	remaining: 201ms
969:	learn: 0.2725852	total: 6.28s	remaining: 194ms
970:	learn: 0.2725360	total: 6.29s	remaining: 188ms
971:	learn: 0.2725030	total: 6.29s	remaining: 181ms
972:	learn: 0.2724717	total: 6.3s	remaining: 175ms
973:	learn: 0.2724093	total: 6.31s	remaining: 168ms
974:	learn: 0.2723615	total: 6.31s	remaining: 162ms
975:	learn: 0.2723195	total: 6.32s	remaining: 155ms
976:	learn: 0.2722532	total: 6.33s	remaining: 149ms
977:	learn: 0.2722075	total: 6.34s	remaining: 143ms
978:	learn: 0.2721792	total: 6.34s	remaining: 136ms
979:	learn: 0.2721209	total: 6.35s	remaining: 130ms
980:	learn: 0.2720569	total: 6.36s	remaining: 123ms
981:	learn: 0.2720320	total: 6.37s	remaining: 117ms
982:	learn: 0.2719971	total: 6.37s	remaining: 110ms
983:	learn: 0.2719435	total: 6.38s	remaining: 104ms
984:	learn: 0.2718894	total: 6.39s	remaining: 97.3ms
985:	learn: 0.2718407	total: 6.39s	remaining: 90.8ms
986:	learn: 0.2717862	total: 6.4s	remaining: 84.3ms
987:	learn: 0.2717428	total: 6.41s	remaining: 77.8ms
988:	learn: 0.2716663	total: 6.42s	remaining: 71.4ms
989:	learn: 0.2716210	total: 6.42s	remaining: 64.9ms
990:	learn: 0.2715797	total: 6.43s	remaining: 58.4ms
991:	learn: 0.2715417	total: 6.44s	remaining: 51.9ms
992:	learn: 0.2714957	total: 6.44s	remaining: 45.4ms
993:	learn: 0.2714622	total: 6.45s	remaining: 38.9ms
994:	learn: 0.2713902	total: 6.46s	remaining: 32.4ms
995:	learn: 0.2713438	total: 6.46s	remaining: 26ms
996:	learn: 0.2712759	total: 6.47s	remaining: 19.5ms
997:	learn: 0.2712458	total: 6.48s	remaining: 13ms
998:	learn: 0.2711950	total: 6.48s	remaining: 6.49ms
999:	learn: 0.2711510	total: 6.49s	remaining: 0us
In [203]:
cat_finalscore
Out[203]:
0.8632019820670127
In [204]:
r={"Catboost":86.32019820}
Results.append(r)
In [205]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();
In [206]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.88      0.95      0.92     26464
           1       0.76      0.56      0.64      7440

    accuracy                           0.86     33904
   macro avg       0.82      0.75      0.78     33904
weighted avg       0.86      0.86      0.86     33904

df_na_rm¶

In [207]:
r="df_na_rm"
Results.append(r)
In [208]:
df_na_rm ['TempDiff'] = df_na_rm ['MaxTemp'] - df_na_rm ['MinTemp']
df_na_rm =df_na_rm .drop(['MaxTemp','MinTemp'],axis=1)
In [209]:
cols = df_na_rm .columns.tolist()
cols = cols[-1:] + cols[:-1]
df_na_rm  =df_na_rm [cols]
df_na_rm .info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 56452 entries, 6049 to 142302
Data columns (total 19 columns):
 #   Column         Non-Null Count  Dtype         
---  ------         --------------  -----         
 0   TempDiff       56452 non-null  float64       
 1   Date           56452 non-null  datetime64[ns]
 2   Location       56452 non-null  object        
 3   Rainfall       56452 non-null  float64       
 4   Evaporation    56452 non-null  float64       
 5   Sunshine       56452 non-null  float64       
 6   WindGustDir    56452 non-null  object        
 7   WindGustSpeed  56452 non-null  float64       
 8   WindDir9am     56452 non-null  object        
 9   WindDir3pm     56452 non-null  object        
 10  WindSpeed9am   56452 non-null  float64       
 11  WindSpeed3pm   56452 non-null  float64       
 12  Humidity9am    56452 non-null  float64       
 13  Humidity3pm    56452 non-null  float64       
 14  Pressure3pm    56452 non-null  float64       
 15  Cloud9am       56452 non-null  float64       
 16  Cloud3pm       56452 non-null  float64       
 17  RainToday      56452 non-null  object        
 18  RainTomorrow   56452 non-null  object        
dtypes: datetime64[ns](1), float64(12), object(6)
memory usage: 8.6+ MB
In [ ]:
 
In [210]:
fig, ax = plt.subplots(figsize=(12,8))
mask = np.triu(np.ones_like(df_na_rm  .corr(), dtype=np.bool_))
sns.heatmap(df_na_rm  .corr(), annot=True, cmap="Blues", mask=mask, linewidth=0.5)
Out[210]:
<Axes: >
In [211]:
df_na_rm .info()
<class 'pandas.core.frame.DataFrame'>
Int64Index: 56452 entries, 6049 to 142302
Data columns (total 19 columns):
 #   Column         Non-Null Count  Dtype         
---  ------         --------------  -----         
 0   TempDiff       56452 non-null  float64       
 1   Date           56452 non-null  datetime64[ns]
 2   Location       56452 non-null  object        
 3   Rainfall       56452 non-null  float64       
 4   Evaporation    56452 non-null  float64       
 5   Sunshine       56452 non-null  float64       
 6   WindGustDir    56452 non-null  object        
 7   WindGustSpeed  56452 non-null  float64       
 8   WindDir9am     56452 non-null  object        
 9   WindDir3pm     56452 non-null  object        
 10  WindSpeed9am   56452 non-null  float64       
 11  WindSpeed3pm   56452 non-null  float64       
 12  Humidity9am    56452 non-null  float64       
 13  Humidity3pm    56452 non-null  float64       
 14  Pressure3pm    56452 non-null  float64       
 15  Cloud9am       56452 non-null  float64       
 16  Cloud3pm       56452 non-null  float64       
 17  RainToday      56452 non-null  object        
 18  RainTomorrow   56452 non-null  object        
dtypes: datetime64[ns](1), float64(12), object(6)
memory usage: 8.6+ MB
In [212]:
#df_na_rm =pd.concat([Date,Location,finalresult2,RainTomorrow],axis=1)
df_na_rm .shape
Out[212]:
(56452, 19)

df_outliers__colna_rm ['Date'] = pd.to_numeric(pd.to_datetime(df_outliers__colna_rm ['Date'])) df_outliers__colna_rm .columns = [c.replace(' ', '_') for c in df_outliers__colna_rm ] df_outliers__colna_rm ['Location'] = df_outliers__colna_rm ['Location'].str.lower()

In [213]:
cat_cols
Out[213]:
['Date',
 'Location',
 'WindGustDir',
 'WindDir9am',
 'WindDir3pm',
 'RainToday',
 'RainTomorrow']
In [214]:
le = LabelEncoder()
df_na_rm [cat_cols] =df_na_rm [cat_cols].astype('str').apply(le.fit_transform)
In [215]:
df_na_rm .isnull().sum()
Out[215]:
TempDiff         0
Date             0
Location         0
Rainfall         0
Evaporation      0
Sunshine         0
WindGustDir      0
WindGustSpeed    0
WindDir9am       0
WindDir3pm       0
WindSpeed9am     0
WindSpeed3pm     0
Humidity9am      0
Humidity3pm      0
Pressure3pm      0
Cloud9am         0
Cloud3pm         0
RainToday        0
RainTomorrow     0
dtype: int64
In [216]:
df_na_rm .head()
Out[216]:
TempDiff Date Location Rainfall Evaporation Sunshine WindGustDir WindGustSpeed WindDir9am WindDir3pm WindSpeed9am WindSpeed3pm Humidity9am Humidity3pm Pressure3pm Cloud9am Cloud3pm RainToday RainTomorrow
6049 17.3 407 4 0.0 12.0 12.3 11 48.0 1 12 6.0 20.0 20.0 13.0 1004.4 2.0 5.0 0 0
6050 10.5 408 4 0.0 14.8 13.0 8 37.0 10 10 19.0 19.0 30.0 8.0 1012.1 1.0 1.0 0 0
6052 18.2 410 4 0.0 10.8 10.6 5 46.0 5 6 30.0 15.0 42.0 22.0 1009.2 1.0 6.0 0 0
6053 16.5 411 4 0.0 11.4 12.2 14 31.0 14 15 6.0 6.0 37.0 22.0 1009.1 1.0 5.0 0 0
6054 16.8 412 4 0.0 11.2 8.4 14 35.0 7 14 17.0 13.0 19.0 15.0 1007.4 1.0 6.0 0 0
In [217]:
X = df_na_rm .iloc[:, :-1]
y = df_na_rm .iloc[:, -1:]

X_train, X_test, y_train, y_test = train_test_split(X, y,
                                                   test_size=0.3, random_state = 1)
In [ ]:
 
In [218]:
#Logistic Regression
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.linear_model import LogisticRegression
lr = LogisticRegression()
lr.fit(X_train,y_train)
Out[218]:
LogisticRegression()
In a Jupyter environment, please rerun this cell to show the HTML representation or trust the notebook.
On GitHub, the HTML representation is unable to render, please try loading this page with nbviewer.org.
LogisticRegression()
In [219]:
lr.intercept_
Out[219]:
array([0.00030617])
In [220]:
lr.coef_
Out[220]:
array([[ 2.50295632e-02,  2.96300757e-05, -1.02713632e-02,
         1.34332464e-02,  9.95368769e-03, -1.11603094e-01,
         1.31732970e-02,  7.37417748e-02, -3.69588531e-02,
        -7.89229233e-03, -2.41738901e-02, -3.22259069e-02,
         8.03976682e-04,  6.05846494e-02, -6.54038588e-03,
         3.61637910e-02,  7.71214931e-02,  6.54954787e-03]])
In [ ]:
 
In [221]:
predictions = lr.predict(X_test)
lr.score(X_train,y_train)
Out[221]:
0.8476060329992914
In [222]:
r={"LogisticRegression":84.7606032}
Results.append(r)
In [223]:
print(confusion_matrix(y_test, predictions))
cm = confusion_matrix(y_test, predictions)

fig, ax = plt.subplots(figsize=(8, 8))
ax.imshow(cm)
ax.grid(False)
ax.xaxis.set(ticks=(0, 1), ticklabels=('Predicted 0s', 'Predicted 1s'))
ax.yaxis.set(ticks=(0, 1), ticklabels=('Actual 0s', 'Actual 1s'))
ax.set_ylim(1.5, -0.5)
for i in range(2):
    for j in range(2):
        ax.text(j, i, cm[i, j], ha='center', va='center', color='red')
plt.show()
[[12508   665]
 [ 1815  1948]]
In [224]:
    print(classification_report(y_test, predictions))
              precision    recall  f1-score   support

           0       0.87      0.95      0.91     13173
           1       0.75      0.52      0.61      3763

    accuracy                           0.85     16936
   macro avg       0.81      0.73      0.76     16936
weighted avg       0.84      0.85      0.84     16936

In [225]:
print(accuracy_score(y_test, predictions))
0.8535663675011809
In [ ]:
 
In [226]:
#ROC Curve
from sklearn.metrics import roc_auc_score
from sklearn.metrics import roc_curve
logit_roc_auc = roc_auc_score(y_test, predictions)
fpr, tpr, thresholds = roc_curve(y_test, lr.predict_proba(X_test)[:,1])
plt.figure()
plt.plot(fpr, tpr, label='Logistic Regression (area = %0.2f)' % logit_roc_auc)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.savefig('Log_ROC')
plt.show()
In [ ]:
 
In [227]:
from xgboost import XGBClassifier
In [228]:
xgb_model = XGBClassifier().fit(X_train, y_train)
In [229]:
y_pred = xgb_model.predict(X_test)
In [230]:
accuracy_score(y_test, y_pred)
Out[230]:
0.8633679735474729
In [231]:
r={"xgboost":86.336797}
Results.append(r)
In [232]:
from sklearn.metrics import mean_squared_error
rmse = mean_squared_error(y_test,y_pred, squared=False)
print(f"RMSE of the base model: {rmse:.3f}")
RMSE of the base model: 0.370
In [233]:
print(confusion_matrix(y_test, y_pred))
cm = confusion_matrix(y_test, y_pred)

fig, ax = plt.subplots(figsize=(8, 8))
ax.imshow(cm)
ax.grid(False)
ax.xaxis.set(ticks=(0, 1), ticklabels=('Predicted 0s', 'Predicted 1s'))
ax.yaxis.set(ticks=(0, 1), ticklabels=('Actual 0s', 'Actual 1s'))
ax.set_ylim(1.5, -0.5)
for i in range(2):
    for j in range(2):
        ax.text(j, i, cm[i, j], ha='center', va='center', color='red')
plt.show()
[[12423   750]
 [ 1564  2199]]
In [234]:
# Compute micro-average ROC curve and ROC area
xgb_roc_auc = roc_auc_score(y_test, y_pred)
fpr, tpr, thresholds = roc_curve(y_test, xgb_model.predict_proba(X_test)[:,1])
plt.figure()
plt.plot(fpr, tpr, label='xgb_model (area = %0.2f)' % logit_roc_auc)
plt.plot([0, 1], [0, 1],'r--')
plt.xlim([0.0, 1.0])
plt.ylim([0.0, 1.05])
plt.xlabel('False Positive Rate')
plt.ylabel('True Positive Rate')
plt.title('Receiver operating characteristic')
plt.legend(loc="lower right")
plt.savefig('Log_ROC')
plt.show()
In [235]:
print(classification_report(y_test, y_pred))
              precision    recall  f1-score   support

           0       0.89      0.94      0.91     13173
           1       0.75      0.58      0.66      3763

    accuracy                           0.86     16936
   macro avg       0.82      0.76      0.79     16936
weighted avg       0.86      0.86      0.86     16936

In [236]:
#RandomForestClassifier
rf= RandomForestClassifier()
rf.fit(X_train, y_train)
y_pred = rf.predict(X_test)
In [237]:
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)
Accuracy: 0.8633679735474729
In [238]:
r={"RandomForestClassifier":86.10061}
Results.append(r)
In [239]:
# Export the first three decision trees from the forest
from sklearn.tree import export_graphviz
from IPython.display import Image
import graphviz
for i in range(3):
    tree = rf.estimators_[i]
    dot_data = export_graphviz(tree,
                               feature_names=X_train.columns,  
                               filled=True,  
                               max_depth=2, 
                               impurity=False, 
                               proportion=True)
    graph = graphviz.Source(dot_data)
    display(graph)
Tree 0 Humidity3pm <= 69.5 samples = 100.0% value = [0.778, 0.222] 1 Sunshine <= 7.65 samples = 84.3% value = [0.858, 0.142] 0->1 True 6044 Cloud3pm <= 6.5 samples = 15.7% value = [0.348, 0.652] 0->6044 False 2 Humidity3pm <= 60.5 samples = 28.3% value = [0.718, 0.282] 1->2 3271 Humidity3pm <= 54.5 samples = 56.0% value = [0.93, 0.07] 1->3271 3 (...) 2->3 2080 (...) 2->2080 3272 (...) 3271->3272 5013 (...) 3271->5013 6045 RainToday <= 0.5 samples = 3.7% value = [0.542, 0.458] 6044->6045 6534 Cloud3pm <= 7.5 samples = 12.0% value = [0.29, 0.71] 6044->6534 6046 (...) 6045->6046 6299 (...) 6045->6299 6535 (...) 6534->6535 7422 (...) 6534->7422
Tree 0 Humidity3pm <= 67.5 samples = 100.0% value = [0.78, 0.22] 1 Rainfall <= 0.75 samples = 81.9% value = [0.868, 0.132] 0->1 True 5706 Cloud3pm <= 6.5 samples = 18.1% value = [0.385, 0.615] 0->5706 False 2 WindGustSpeed <= 56.5 samples = 67.0% value = [0.898, 0.102] 1->2 4083 Pressure3pm <= 1012.25 samples = 14.9% value = [0.733, 0.267] 1->4083 3 (...) 2->3 3372 (...) 2->3372 4084 (...) 4083->4084 4847 (...) 4083->4847 5707 Pressure3pm <= 1012.55 samples = 4.9% value = [0.584, 0.416] 5706->5707 6310 WindSpeed9am <= 16.0 samples = 13.2% value = [0.309, 0.691] 5706->6310 5708 (...) 5707->5708 5993 (...) 5707->5993 6311 (...) 6310->6311 7182 (...) 6310->7182
Tree 0 Sunshine <= 6.25 samples = 100.0% value = [0.779, 0.221] 1 Sunshine <= 1.95 samples = 32.1% value = [0.536, 0.464] 0->1 True 3848 TempDiff <= 9.25 samples = 67.9% value = [0.894, 0.106] 0->3848 False 2 Cloud3pm <= 7.5 samples = 11.2% value = [0.415, 0.585] 1->2 1313 Cloud3pm <= 7.5 samples = 20.8% value = [0.601, 0.399] 1->1313 3 (...) 2->3 780 (...) 2->780 1314 (...) 1313->1314 3543 (...) 1313->3543 3849 Humidity3pm <= 70.5 samples = 19.6% value = [0.795, 0.205] 3848->3849 5738 Rainfall <= 1.1 samples = 48.3% value = [0.933, 0.067] 3848->5738 3850 (...) 3849->3850 5423 (...) 3849->5423 5739 (...) 5738->5739 7710 (...) 5738->7710
In [ ]:
 
In [240]:
# Create the confusion matrix
from sklearn.metrics import ConfusionMatrixDisplay
cm = confusion_matrix(y_test, y_pred)

ConfusionMatrixDisplay(confusion_matrix=cm).plot();
In [241]:
accuracy = accuracy_score(y_test, y_pred)
precision = precision_score(y_test, y_pred)
recall = recall_score(y_test, y_pred)
from sklearn.metrics import mean_absolute_error, mean_squared_error

print("Accuracy:", accuracy)
print("Precision:", precision)
print("Recall:", recall)

print('Mean Absolute Error:', mean_absolute_error(y_test, y_pred))
print('Mean Squared Error:', mean_squared_error(y_test, y_pred))
print('Root Mean Squared Error:', np.sqrt(mean_squared_error(y_test, y_pred)))
Accuracy: 0.8633679735474729
Precision: 0.7789757412398922
Recall: 0.537602976348658
Mean Absolute Error: 0.13663202645252717
Mean Squared Error: 0.13663202645252717
Root Mean Squared Error: 0.3696376962006543
In [242]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.88      0.96      0.92     13173
           1       0.78      0.54      0.64      3763

    accuracy                           0.86     16936
   macro avg       0.83      0.75      0.78     16936
weighted avg       0.86      0.86      0.85     16936

In [243]:
# Organizing feature names and importances in a DataFrame
features_df = pd.DataFrame({'features': rf.feature_names_in_, 'importances': rf.feature_importances_ })

# Sorting data from highest to lowest
features_df_sorted = features_df.sort_values(by='importances', ascending=False)

# Barplot of the result without borders and axis lines
g = sns.barplot(data=features_df_sorted, x='importances', y ='features', palette="rocket")
sns.despine(bottom = True, left = True)
g.set_title('Feature importances')
g.set(xlabel=None)
g.set(ylabel=None)
g.set(xticks=[])
for value in g.containers:
    g.bar_label(value, padding=2)
In [244]:
#KNeighborsClassifier
In [245]:
#KNeighborsClassifier
knn = KNeighborsClassifier()
knn.fit(X_train, y_train)
knn_head = knn.predict(X_test)

print(f"""
accuracy_score: {accuracy_score(knn_head, y_test)}
roc_auc_score: {roc_auc_score(knn_head, y_test)}
""")
accuracy_score: 0.8395725082664147
roc_auc_score: 0.7860291970672961

In [246]:
def found_good_neighbors_1(n, p):
    knn = KNeighborsClassifier(n_neighbors=n, p=p, 
                               metric='minkowski')
    knn.fit(X_train, y_train)
    return knn.score(X_test, y_test)

def found_goot_depth(n, criterion_):
    tree = DecisionTreeClassifier(max_depth=n, 
                                  criterion=criterion_,
                                  random_state=42)
    tree.fit(X_train, y_train)
    return tree.score(X_test, y_test)
In [247]:
knn_1 = [found_good_neighbors_1(n, 1) for n in range(1, 22, 2)]
knn_2 = [found_good_neighbors_1(n, 2) for n in range(1, 22, 2)]
In [248]:
tree_gini = [found_goot_depth(n, 'gini') for n in range(1, 22, 2)]
tree_entropy = [found_goot_depth(n, 'entropy') for n in range(1, 22, 2)]
In [249]:
len(knn_1)
Out[249]:
11
In [250]:
l=knn_1[0]
a=1
for i in range(0,10):
    if (knn_1[i]>l):
        l=knn_1[i]
        if(i==0):
            a=1
        else:
            a=(i*2)+1
            
       
print (l,a)
0.8487246102975909 11
In [251]:
plt.figure(figsize=(12, 7))
plt.subplot(2, 2, 1)
plt.plot(tree_gini)
plt.title('tree_gini')
plt.legend(['score'])
plt.subplot(2, 2, 2)
plt.plot(tree_entropy)
plt.title('tree_entropy')
plt.legend(['score'])
plt.subplot(2, 2, 3)
plt.plot(knn_1)
plt.title('knn_1')
plt.legend(['score'])
plt.subplot(2, 2, 4)
plt.plot(knn_2)
plt.title('knn_2')
plt.legend(['score'])
plt.show()
In [252]:
print(f"""
tree_gini: {max(tree_gini)}
tree_entropy: {max(tree_entropy)}
knn_1: {max(knn_1)}
knn_2: {max(knn_2)}
""")
tree_gini: 0.8488427019367029
tree_entropy: 0.8454180444024563
knn_1: 0.8487246102975909
knn_2: 0.8457132735002362

As we can see the decisive trees begin to fall at a depth of 4-5. What we cannot say about the nearest-neighbor method. I think we should still do tests starting from 20 to 50 in increments of 3 for nearest neighbours

In [253]:
knn_1 = [found_good_neighbors_1(n, 1) for n in range(20, 51, 3)]
knn_2 = [found_good_neighbors_1(n, 2) for n in range(20, 51, 3)]
In [254]:
plt.figure(figsize=(14, 9))
plt.subplot(2,2,1)
plt.plot(knn_1)
plt.title('knn_1')
plt.legend(['score'])
plt.subplot(2, 2, 2)
plt.plot(knn_2)
plt.title('knn_2')
plt.legend(['score'])
plt.show()
In [255]:
print(f"""
knn_1: {max(knn_1)}
knn_2: {max(knn_2)}
""")
knn_1: 0.8467760982522438
knn_2: 0.8432923948984412

In [256]:
#knn_1: 0.836153544849197 at k=9
knn = KNeighborsClassifier(n_neighbors=11, p=1, 
                               metric='minkowski')
knn.fit(X_train, y_train)
knn_head = knn.predict(X_test)

print(f"""
accuracy_score: {accuracy_score(knn_head, y_test)}
roc_auc_score: {roc_auc_score(knn_head, y_test)}
""")
accuracy_score: 0.8487246102975909
roc_auc_score: 0.8171481745927889

In [257]:
r={"knn":84.87246102}
Results.append(r)
In [258]:
# Evaluate Model
# Create the confusion matrix
#from sklearn.metrics import ConfusionMatrixDisplay
cm = confusion_matrix(y_test, knn_head)

ConfusionMatrixDisplay(confusion_matrix=cm).plot();
In [259]:
print(classification_report(y_test,knn_head))
              precision    recall  f1-score   support

           0       0.86      0.96      0.91     13173
           1       0.77      0.45      0.57      3763

    accuracy                           0.85     16936
   macro avg       0.82      0.71      0.74     16936
weighted avg       0.84      0.85      0.83     16936

In [260]:
#t Gaussian Naive Bayes
from sklearn.naive_bayes import GaussianNB
gnb = GaussianNB()
gnb.fit(X_train, y_train)
y_pred = gnb.predict(X_test)
accuracy = accuracy_score(y_test, y_pred)
print("Accuracy:", accuracy)
report = classification_report(y_test, y_pred)
print("Classification Report:\n", report)
Accuracy: 0.7970595181861124
Classification Report:
               precision    recall  f1-score   support

           0       0.91      0.82      0.86     13173
           1       0.53      0.71      0.61      3763

    accuracy                           0.80     16936
   macro avg       0.72      0.76      0.74     16936
weighted avg       0.82      0.80      0.81     16936

In [261]:
r={"Gaussian Naive Bayes":79.70595181}
Results.append(r)
In [262]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();
In [263]:
from sklearn.metrics import roc_auc_score

ROC_AUC = roc_auc_score(y_test, y_pred)

print('ROC AUC : {:.4f}'.format(ROC_AUC))
ROC AUC : 0.7644
In [264]:
# calculate cross-validated ROC AUC 

from sklearn.model_selection import cross_val_score

Cross_validated_ROC_AUC = cross_val_score(gnb, X_train, y_train, cv=5, scoring='roc_auc').mean()

print('Cross validated ROC AUC : {:.4f}'.format(Cross_validated_ROC_AUC))
Cross validated ROC AUC : 0.8419
In [265]:
#Gradient Boosting Classifier¶
from sklearn.ensemble import GradientBoostingClassifier
gbm_model = GradientBoostingClassifier().fit(X_train, y_train)
y_pred = gbm_model.predict(X_test)
accuracy_score(y_test, y_pred)
Out[265]:
0.858290033065659
In [266]:
r={"gradient Boosting Classifier":85.8349078885}
Results.append(r)
In [267]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();
In [268]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.88      0.95      0.91     13173
           1       0.75      0.54      0.63      3763

    accuracy                           0.86     16936
   macro avg       0.82      0.75      0.77     16936
weighted avg       0.85      0.86      0.85     16936

In [269]:
#LightGBM

from lightgbm import LGBMClassifier
lgbm_model = LGBMClassifier().fit(X_train, y_train)
y_pred = lgbm_model.predict(X_test)
accuracy_score(y_test, y_pred)
Out[269]:
0.8649622106754842
In [270]:
r={"LightGBM":86.4962210675}
Results.append(r)
In [271]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();
In [272]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.89      0.95      0.92     13173
           1       0.76      0.57      0.65      3763

    accuracy                           0.86     16936
   macro avg       0.82      0.76      0.78     16936
weighted avg       0.86      0.86      0.86     16936

In [273]:
print('Training accuracy {:.4f}'.format(lgbm_model.score(X_train,y_train)))
print('Testing accuracy {:.4f}'.format(lgbm_model.score(X_test,y_test)))
Training accuracy 0.8844
Testing accuracy 0.8650
In [274]:
# As we can clearly see that there is absolutely no significant difference between both the accuracies and hence the model has made an estimation that is quite accurate.
In [275]:
import lightgbm as lgb
lgb.plot_importance(lgbm_model)
Out[275]:
<Axes: title={'center': 'Feature importance'}, xlabel='Feature importance', ylabel='Features'>
In [276]:
lgb.plot_tree(lgbm_model,figsize=(30,40))
Out[276]:
<Axes: >
In [ ]:
 
In [277]:
#Catboost
from catboost import CatBoostClassifier, Pool
cat = CatBoostClassifier()
cat.fit(X_train, y_train)
y_pred = cat.predict(X_test)
cat_finalscore = accuracy_score(y_test, y_pred)
Learning rate set to 0.049517
0:	learn: 0.6528546	total: 8.13ms	remaining: 8.12s
1:	learn: 0.6192730	total: 12.2ms	remaining: 6.06s
2:	learn: 0.5869369	total: 16.7ms	remaining: 5.55s
3:	learn: 0.5613419	total: 20.7ms	remaining: 5.15s
4:	learn: 0.5355998	total: 25ms	remaining: 4.98s
5:	learn: 0.5146134	total: 29.3ms	remaining: 4.85s
6:	learn: 0.4958685	total: 33.2ms	remaining: 4.71s
7:	learn: 0.4788958	total: 37.7ms	remaining: 4.68s
8:	learn: 0.4639151	total: 41.6ms	remaining: 4.58s
9:	learn: 0.4513773	total: 45.4ms	remaining: 4.49s
10:	learn: 0.4406966	total: 50.3ms	remaining: 4.52s
11:	learn: 0.4314914	total: 54.9ms	remaining: 4.52s
12:	learn: 0.4226309	total: 58.5ms	remaining: 4.44s
13:	learn: 0.4149273	total: 62.2ms	remaining: 4.38s
14:	learn: 0.4076492	total: 66.4ms	remaining: 4.36s
15:	learn: 0.4016590	total: 71ms	remaining: 4.36s
16:	learn: 0.3955887	total: 74.8ms	remaining: 4.32s
17:	learn: 0.3907154	total: 79.1ms	remaining: 4.31s
18:	learn: 0.3872765	total: 82.4ms	remaining: 4.26s
19:	learn: 0.3827674	total: 87.8ms	remaining: 4.3s
20:	learn: 0.3792592	total: 91.9ms	remaining: 4.28s
21:	learn: 0.3755712	total: 96.9ms	remaining: 4.3s
22:	learn: 0.3726215	total: 102ms	remaining: 4.33s
23:	learn: 0.3696402	total: 106ms	remaining: 4.31s
24:	learn: 0.3664539	total: 110ms	remaining: 4.29s
25:	learn: 0.3642604	total: 115ms	remaining: 4.29s
26:	learn: 0.3618883	total: 119ms	remaining: 4.27s
27:	learn: 0.3596816	total: 124ms	remaining: 4.31s
28:	learn: 0.3577649	total: 128ms	remaining: 4.3s
29:	learn: 0.3560347	total: 132ms	remaining: 4.27s
30:	learn: 0.3540688	total: 136ms	remaining: 4.26s
31:	learn: 0.3523378	total: 140ms	remaining: 4.24s
32:	learn: 0.3506091	total: 145ms	remaining: 4.24s
33:	learn: 0.3492876	total: 149ms	remaining: 4.24s
34:	learn: 0.3480675	total: 153ms	remaining: 4.22s
35:	learn: 0.3469292	total: 158ms	remaining: 4.24s
36:	learn: 0.3462177	total: 163ms	remaining: 4.24s
37:	learn: 0.3452446	total: 167ms	remaining: 4.24s
38:	learn: 0.3443574	total: 171ms	remaining: 4.22s
39:	learn: 0.3436819	total: 177ms	remaining: 4.24s
40:	learn: 0.3426116	total: 181ms	remaining: 4.24s
41:	learn: 0.3417909	total: 187ms	remaining: 4.26s
42:	learn: 0.3410250	total: 191ms	remaining: 4.25s
43:	learn: 0.3402555	total: 194ms	remaining: 4.22s
44:	learn: 0.3394674	total: 199ms	remaining: 4.21s
45:	learn: 0.3388125	total: 204ms	remaining: 4.23s
46:	learn: 0.3382500	total: 208ms	remaining: 4.23s
47:	learn: 0.3375865	total: 213ms	remaining: 4.23s
48:	learn: 0.3368895	total: 218ms	remaining: 4.23s
49:	learn: 0.3362309	total: 223ms	remaining: 4.23s
50:	learn: 0.3356599	total: 226ms	remaining: 4.21s
51:	learn: 0.3352256	total: 231ms	remaining: 4.21s
52:	learn: 0.3346716	total: 235ms	remaining: 4.2s
53:	learn: 0.3341893	total: 240ms	remaining: 4.2s
54:	learn: 0.3337281	total: 244ms	remaining: 4.19s
55:	learn: 0.3332935	total: 248ms	remaining: 4.18s
56:	learn: 0.3328485	total: 252ms	remaining: 4.17s
57:	learn: 0.3324460	total: 256ms	remaining: 4.15s
58:	learn: 0.3320560	total: 259ms	remaining: 4.13s
59:	learn: 0.3317581	total: 263ms	remaining: 4.13s
60:	learn: 0.3313818	total: 268ms	remaining: 4.12s
61:	learn: 0.3309365	total: 272ms	remaining: 4.11s
62:	learn: 0.3304684	total: 275ms	remaining: 4.1s
63:	learn: 0.3300931	total: 280ms	remaining: 4.09s
64:	learn: 0.3295836	total: 284ms	remaining: 4.08s
65:	learn: 0.3292126	total: 288ms	remaining: 4.07s
66:	learn: 0.3288732	total: 293ms	remaining: 4.07s
67:	learn: 0.3286360	total: 296ms	remaining: 4.06s
68:	learn: 0.3281754	total: 300ms	remaining: 4.05s
69:	learn: 0.3278941	total: 305ms	remaining: 4.05s
70:	learn: 0.3276234	total: 308ms	remaining: 4.04s
71:	learn: 0.3274190	total: 312ms	remaining: 4.02s
72:	learn: 0.3271125	total: 317ms	remaining: 4.02s
73:	learn: 0.3268673	total: 321ms	remaining: 4.01s
74:	learn: 0.3265472	total: 325ms	remaining: 4.01s
75:	learn: 0.3263277	total: 329ms	remaining: 4s
76:	learn: 0.3260402	total: 333ms	remaining: 4s
77:	learn: 0.3257197	total: 338ms	remaining: 3.99s
78:	learn: 0.3255252	total: 342ms	remaining: 3.98s
79:	learn: 0.3253041	total: 346ms	remaining: 3.98s
80:	learn: 0.3251138	total: 350ms	remaining: 3.97s
81:	learn: 0.3247886	total: 355ms	remaining: 3.97s
82:	learn: 0.3244969	total: 359ms	remaining: 3.96s
83:	learn: 0.3242476	total: 363ms	remaining: 3.96s
84:	learn: 0.3239244	total: 367ms	remaining: 3.95s
85:	learn: 0.3237227	total: 370ms	remaining: 3.94s
86:	learn: 0.3234998	total: 374ms	remaining: 3.93s
87:	learn: 0.3232762	total: 379ms	remaining: 3.93s
88:	learn: 0.3231002	total: 382ms	remaining: 3.91s
89:	learn: 0.3229001	total: 386ms	remaining: 3.9s
90:	learn: 0.3226728	total: 390ms	remaining: 3.9s
91:	learn: 0.3225313	total: 395ms	remaining: 3.9s
92:	learn: 0.3223285	total: 399ms	remaining: 3.89s
93:	learn: 0.3221278	total: 403ms	remaining: 3.89s
94:	learn: 0.3218919	total: 407ms	remaining: 3.88s
95:	learn: 0.3216675	total: 411ms	remaining: 3.87s
96:	learn: 0.3214379	total: 415ms	remaining: 3.86s
97:	learn: 0.3212523	total: 419ms	remaining: 3.85s
98:	learn: 0.3209857	total: 423ms	remaining: 3.85s
99:	learn: 0.3207024	total: 428ms	remaining: 3.85s
100:	learn: 0.3204909	total: 431ms	remaining: 3.84s
101:	learn: 0.3202085	total: 435ms	remaining: 3.83s
102:	learn: 0.3200339	total: 439ms	remaining: 3.83s
103:	learn: 0.3198193	total: 443ms	remaining: 3.82s
104:	learn: 0.3196696	total: 448ms	remaining: 3.82s
105:	learn: 0.3195257	total: 451ms	remaining: 3.81s
106:	learn: 0.3193198	total: 456ms	remaining: 3.8s
107:	learn: 0.3191651	total: 461ms	remaining: 3.81s
108:	learn: 0.3189747	total: 467ms	remaining: 3.81s
109:	learn: 0.3187394	total: 471ms	remaining: 3.81s
110:	learn: 0.3185703	total: 476ms	remaining: 3.81s
111:	learn: 0.3184844	total: 480ms	remaining: 3.81s
112:	learn: 0.3182390	total: 485ms	remaining: 3.81s
113:	learn: 0.3179950	total: 489ms	remaining: 3.8s
114:	learn: 0.3178895	total: 494ms	remaining: 3.8s
115:	learn: 0.3177166	total: 498ms	remaining: 3.79s
116:	learn: 0.3175447	total: 502ms	remaining: 3.79s
117:	learn: 0.3174262	total: 506ms	remaining: 3.78s
118:	learn: 0.3172587	total: 510ms	remaining: 3.77s
119:	learn: 0.3171179	total: 514ms	remaining: 3.77s
120:	learn: 0.3169574	total: 517ms	remaining: 3.76s
121:	learn: 0.3168117	total: 523ms	remaining: 3.77s
122:	learn: 0.3167111	total: 527ms	remaining: 3.76s
123:	learn: 0.3164779	total: 532ms	remaining: 3.76s
124:	learn: 0.3163058	total: 536ms	remaining: 3.75s
125:	learn: 0.3162327	total: 541ms	remaining: 3.75s
126:	learn: 0.3161241	total: 545ms	remaining: 3.75s
127:	learn: 0.3159065	total: 550ms	remaining: 3.74s
128:	learn: 0.3155597	total: 554ms	remaining: 3.74s
129:	learn: 0.3153773	total: 558ms	remaining: 3.73s
130:	learn: 0.3151360	total: 562ms	remaining: 3.73s
131:	learn: 0.3149926	total: 566ms	remaining: 3.72s
132:	learn: 0.3148696	total: 570ms	remaining: 3.72s
133:	learn: 0.3147187	total: 574ms	remaining: 3.71s
134:	learn: 0.3145797	total: 579ms	remaining: 3.71s
135:	learn: 0.3142617	total: 583ms	remaining: 3.7s
136:	learn: 0.3141177	total: 588ms	remaining: 3.7s
137:	learn: 0.3139976	total: 592ms	remaining: 3.7s
138:	learn: 0.3138204	total: 596ms	remaining: 3.69s
139:	learn: 0.3136781	total: 600ms	remaining: 3.69s
140:	learn: 0.3135502	total: 605ms	remaining: 3.68s
141:	learn: 0.3134222	total: 609ms	remaining: 3.68s
142:	learn: 0.3132617	total: 614ms	remaining: 3.68s
143:	learn: 0.3131338	total: 618ms	remaining: 3.67s
144:	learn: 0.3127919	total: 623ms	remaining: 3.67s
145:	learn: 0.3126580	total: 627ms	remaining: 3.67s
146:	learn: 0.3125075	total: 631ms	remaining: 3.66s
147:	learn: 0.3121307	total: 635ms	remaining: 3.66s
148:	learn: 0.3119494	total: 640ms	remaining: 3.66s
149:	learn: 0.3117773	total: 645ms	remaining: 3.65s
150:	learn: 0.3116150	total: 649ms	remaining: 3.65s
151:	learn: 0.3114256	total: 654ms	remaining: 3.65s
152:	learn: 0.3112908	total: 657ms	remaining: 3.64s
153:	learn: 0.3111353	total: 662ms	remaining: 3.63s
154:	learn: 0.3109532	total: 666ms	remaining: 3.63s
155:	learn: 0.3108093	total: 670ms	remaining: 3.62s
156:	learn: 0.3106648	total: 674ms	remaining: 3.62s
157:	learn: 0.3104025	total: 679ms	remaining: 3.62s
158:	learn: 0.3102547	total: 684ms	remaining: 3.62s
159:	learn: 0.3101763	total: 688ms	remaining: 3.61s
160:	learn: 0.3100367	total: 692ms	remaining: 3.61s
161:	learn: 0.3098580	total: 696ms	remaining: 3.6s
162:	learn: 0.3097477	total: 700ms	remaining: 3.59s
163:	learn: 0.3096362	total: 703ms	remaining: 3.58s
164:	learn: 0.3095301	total: 708ms	remaining: 3.58s
165:	learn: 0.3094465	total: 713ms	remaining: 3.58s
166:	learn: 0.3093342	total: 718ms	remaining: 3.58s
167:	learn: 0.3091779	total: 722ms	remaining: 3.57s
168:	learn: 0.3090233	total: 726ms	remaining: 3.57s
169:	learn: 0.3089483	total: 730ms	remaining: 3.56s
170:	learn: 0.3088391	total: 734ms	remaining: 3.56s
171:	learn: 0.3086584	total: 737ms	remaining: 3.55s
172:	learn: 0.3084911	total: 742ms	remaining: 3.54s
173:	learn: 0.3083122	total: 746ms	remaining: 3.54s
174:	learn: 0.3082128	total: 749ms	remaining: 3.53s
175:	learn: 0.3080722	total: 753ms	remaining: 3.53s
176:	learn: 0.3079335	total: 757ms	remaining: 3.52s
177:	learn: 0.3077496	total: 761ms	remaining: 3.52s
178:	learn: 0.3076134	total: 765ms	remaining: 3.51s
179:	learn: 0.3075056	total: 769ms	remaining: 3.5s
180:	learn: 0.3074364	total: 773ms	remaining: 3.5s
181:	learn: 0.3073293	total: 777ms	remaining: 3.49s
182:	learn: 0.3071743	total: 781ms	remaining: 3.49s
183:	learn: 0.3070360	total: 786ms	remaining: 3.48s
184:	learn: 0.3069387	total: 790ms	remaining: 3.48s
185:	learn: 0.3068301	total: 793ms	remaining: 3.47s
186:	learn: 0.3067294	total: 797ms	remaining: 3.46s
187:	learn: 0.3066424	total: 801ms	remaining: 3.46s
188:	learn: 0.3065363	total: 807ms	remaining: 3.46s
189:	learn: 0.3063942	total: 812ms	remaining: 3.46s
190:	learn: 0.3062597	total: 817ms	remaining: 3.46s
191:	learn: 0.3061465	total: 821ms	remaining: 3.46s
192:	learn: 0.3059990	total: 825ms	remaining: 3.45s
193:	learn: 0.3058033	total: 830ms	remaining: 3.45s
194:	learn: 0.3056852	total: 834ms	remaining: 3.44s
195:	learn: 0.3055923	total: 838ms	remaining: 3.44s
196:	learn: 0.3054981	total: 842ms	remaining: 3.43s
197:	learn: 0.3053997	total: 847ms	remaining: 3.43s
198:	learn: 0.3052213	total: 851ms	remaining: 3.42s
199:	learn: 0.3051077	total: 856ms	remaining: 3.42s
200:	learn: 0.3050097	total: 860ms	remaining: 3.42s
201:	learn: 0.3048628	total: 864ms	remaining: 3.41s
202:	learn: 0.3046528	total: 869ms	remaining: 3.41s
203:	learn: 0.3045136	total: 873ms	remaining: 3.41s
204:	learn: 0.3043818	total: 877ms	remaining: 3.4s
205:	learn: 0.3042370	total: 881ms	remaining: 3.39s
206:	learn: 0.3041234	total: 884ms	remaining: 3.39s
207:	learn: 0.3039562	total: 889ms	remaining: 3.38s
208:	learn: 0.3038188	total: 893ms	remaining: 3.38s
209:	learn: 0.3037102	total: 897ms	remaining: 3.37s
210:	learn: 0.3035812	total: 902ms	remaining: 3.37s
211:	learn: 0.3034274	total: 906ms	remaining: 3.37s
212:	learn: 0.3033073	total: 910ms	remaining: 3.36s
213:	learn: 0.3031992	total: 914ms	remaining: 3.35s
214:	learn: 0.3031145	total: 917ms	remaining: 3.35s
215:	learn: 0.3029825	total: 921ms	remaining: 3.34s
216:	learn: 0.3028407	total: 925ms	remaining: 3.34s
217:	learn: 0.3027274	total: 929ms	remaining: 3.33s
218:	learn: 0.3025813	total: 932ms	remaining: 3.33s
219:	learn: 0.3024122	total: 936ms	remaining: 3.32s
220:	learn: 0.3022350	total: 940ms	remaining: 3.31s
221:	learn: 0.3020928	total: 945ms	remaining: 3.31s
222:	learn: 0.3019667	total: 949ms	remaining: 3.31s
223:	learn: 0.3018394	total: 953ms	remaining: 3.3s
224:	learn: 0.3016765	total: 957ms	remaining: 3.3s
225:	learn: 0.3015352	total: 962ms	remaining: 3.29s
226:	learn: 0.3014425	total: 965ms	remaining: 3.29s
227:	learn: 0.3013253	total: 970ms	remaining: 3.28s
228:	learn: 0.3012105	total: 974ms	remaining: 3.28s
229:	learn: 0.3010472	total: 978ms	remaining: 3.27s
230:	learn: 0.3009506	total: 981ms	remaining: 3.27s
231:	learn: 0.3008341	total: 985ms	remaining: 3.26s
232:	learn: 0.3007277	total: 989ms	remaining: 3.26s
233:	learn: 0.3006133	total: 993ms	remaining: 3.25s
234:	learn: 0.3004842	total: 997ms	remaining: 3.25s
235:	learn: 0.3004045	total: 1s	remaining: 3.24s
236:	learn: 0.3002665	total: 1s	remaining: 3.23s
237:	learn: 0.3001550	total: 1.01s	remaining: 3.23s
238:	learn: 0.3000104	total: 1.01s	remaining: 3.23s
239:	learn: 0.2998353	total: 1.02s	remaining: 3.22s
240:	learn: 0.2996919	total: 1.02s	remaining: 3.21s
241:	learn: 0.2995662	total: 1.02s	remaining: 3.21s
242:	learn: 0.2994335	total: 1.03s	remaining: 3.21s
243:	learn: 0.2992733	total: 1.03s	remaining: 3.2s
244:	learn: 0.2991268	total: 1.04s	remaining: 3.2s
245:	learn: 0.2989916	total: 1.04s	remaining: 3.2s
246:	learn: 0.2988559	total: 1.05s	remaining: 3.19s
247:	learn: 0.2986889	total: 1.05s	remaining: 3.19s
248:	learn: 0.2985633	total: 1.06s	remaining: 3.19s
249:	learn: 0.2984288	total: 1.06s	remaining: 3.18s
250:	learn: 0.2982441	total: 1.06s	remaining: 3.18s
251:	learn: 0.2981229	total: 1.07s	remaining: 3.18s
252:	learn: 0.2979987	total: 1.07s	remaining: 3.17s
253:	learn: 0.2979187	total: 1.08s	remaining: 3.17s
254:	learn: 0.2977645	total: 1.08s	remaining: 3.16s
255:	learn: 0.2976080	total: 1.09s	remaining: 3.16s
256:	learn: 0.2974860	total: 1.09s	remaining: 3.15s
257:	learn: 0.2973700	total: 1.1s	remaining: 3.15s
258:	learn: 0.2972474	total: 1.1s	remaining: 3.15s
259:	learn: 0.2971211	total: 1.1s	remaining: 3.14s
260:	learn: 0.2969447	total: 1.11s	remaining: 3.14s
261:	learn: 0.2967648	total: 1.11s	remaining: 3.14s
262:	learn: 0.2966258	total: 1.12s	remaining: 3.13s
263:	learn: 0.2965041	total: 1.12s	remaining: 3.13s
264:	learn: 0.2963713	total: 1.13s	remaining: 3.13s
265:	learn: 0.2962378	total: 1.13s	remaining: 3.12s
266:	learn: 0.2961066	total: 1.14s	remaining: 3.12s
267:	learn: 0.2959533	total: 1.14s	remaining: 3.11s
268:	learn: 0.2958164	total: 1.14s	remaining: 3.11s
269:	learn: 0.2956933	total: 1.15s	remaining: 3.1s
270:	learn: 0.2955346	total: 1.15s	remaining: 3.1s
271:	learn: 0.2954459	total: 1.16s	remaining: 3.09s
272:	learn: 0.2952980	total: 1.16s	remaining: 3.09s
273:	learn: 0.2951405	total: 1.16s	remaining: 3.08s
274:	learn: 0.2949328	total: 1.17s	remaining: 3.08s
275:	learn: 0.2948250	total: 1.17s	remaining: 3.08s
276:	learn: 0.2946797	total: 1.18s	remaining: 3.07s
277:	learn: 0.2945507	total: 1.18s	remaining: 3.07s
278:	learn: 0.2944606	total: 1.18s	remaining: 3.06s
279:	learn: 0.2943469	total: 1.19s	remaining: 3.06s
280:	learn: 0.2942276	total: 1.19s	remaining: 3.05s
281:	learn: 0.2941250	total: 1.2s	remaining: 3.04s
282:	learn: 0.2940322	total: 1.2s	remaining: 3.04s
283:	learn: 0.2938898	total: 1.2s	remaining: 3.03s
284:	learn: 0.2937721	total: 1.21s	remaining: 3.03s
285:	learn: 0.2936457	total: 1.21s	remaining: 3.02s
286:	learn: 0.2934969	total: 1.22s	remaining: 3.02s
287:	learn: 0.2933848	total: 1.22s	remaining: 3.01s
288:	learn: 0.2932919	total: 1.22s	remaining: 3.01s
289:	learn: 0.2931659	total: 1.23s	remaining: 3s
290:	learn: 0.2930259	total: 1.23s	remaining: 3s
291:	learn: 0.2928408	total: 1.23s	remaining: 2.99s
292:	learn: 0.2927340	total: 1.24s	remaining: 2.99s
293:	learn: 0.2925810	total: 1.24s	remaining: 2.98s
294:	learn: 0.2924621	total: 1.25s	remaining: 2.98s
295:	learn: 0.2923906	total: 1.25s	remaining: 2.97s
296:	learn: 0.2922735	total: 1.25s	remaining: 2.96s
297:	learn: 0.2921439	total: 1.26s	remaining: 2.96s
298:	learn: 0.2920089	total: 1.26s	remaining: 2.96s
299:	learn: 0.2919205	total: 1.26s	remaining: 2.95s
300:	learn: 0.2917598	total: 1.27s	remaining: 2.94s
301:	learn: 0.2916527	total: 1.27s	remaining: 2.94s
302:	learn: 0.2915845	total: 1.28s	remaining: 2.94s
303:	learn: 0.2914644	total: 1.28s	remaining: 2.93s
304:	learn: 0.2913616	total: 1.28s	remaining: 2.92s
305:	learn: 0.2912464	total: 1.29s	remaining: 2.92s
306:	learn: 0.2911258	total: 1.29s	remaining: 2.92s
307:	learn: 0.2910063	total: 1.3s	remaining: 2.91s
308:	learn: 0.2908527	total: 1.3s	remaining: 2.91s
309:	learn: 0.2907051	total: 1.3s	remaining: 2.9s
310:	learn: 0.2905839	total: 1.31s	remaining: 2.9s
311:	learn: 0.2904719	total: 1.31s	remaining: 2.9s
312:	learn: 0.2903065	total: 1.32s	remaining: 2.89s
313:	learn: 0.2901888	total: 1.32s	remaining: 2.89s
314:	learn: 0.2901032	total: 1.33s	remaining: 2.88s
315:	learn: 0.2899993	total: 1.33s	remaining: 2.88s
316:	learn: 0.2899073	total: 1.33s	remaining: 2.87s
317:	learn: 0.2897655	total: 1.34s	remaining: 2.87s
318:	learn: 0.2896437	total: 1.34s	remaining: 2.87s
319:	learn: 0.2895416	total: 1.34s	remaining: 2.86s
320:	learn: 0.2893734	total: 1.35s	remaining: 2.85s
321:	learn: 0.2892312	total: 1.35s	remaining: 2.85s
322:	learn: 0.2891114	total: 1.36s	remaining: 2.84s
323:	learn: 0.2890125	total: 1.36s	remaining: 2.84s
324:	learn: 0.2888958	total: 1.36s	remaining: 2.83s
325:	learn: 0.2887688	total: 1.37s	remaining: 2.83s
326:	learn: 0.2886585	total: 1.37s	remaining: 2.82s
327:	learn: 0.2885670	total: 1.38s	remaining: 2.82s
328:	learn: 0.2884587	total: 1.38s	remaining: 2.81s
329:	learn: 0.2883356	total: 1.38s	remaining: 2.81s
330:	learn: 0.2882340	total: 1.39s	remaining: 2.8s
331:	learn: 0.2880954	total: 1.39s	remaining: 2.8s
332:	learn: 0.2879850	total: 1.4s	remaining: 2.8s
333:	learn: 0.2878413	total: 1.4s	remaining: 2.8s
334:	learn: 0.2877717	total: 1.41s	remaining: 2.79s
335:	learn: 0.2876688	total: 1.41s	remaining: 2.79s
336:	learn: 0.2875623	total: 1.41s	remaining: 2.78s
337:	learn: 0.2874812	total: 1.42s	remaining: 2.78s
338:	learn: 0.2873503	total: 1.42s	remaining: 2.77s
339:	learn: 0.2872445	total: 1.43s	remaining: 2.77s
340:	learn: 0.2871744	total: 1.43s	remaining: 2.77s
341:	learn: 0.2870806	total: 1.44s	remaining: 2.76s
342:	learn: 0.2869517	total: 1.44s	remaining: 2.76s
343:	learn: 0.2868048	total: 1.45s	remaining: 2.76s
344:	learn: 0.2867189	total: 1.45s	remaining: 2.75s
345:	learn: 0.2865958	total: 1.46s	remaining: 2.75s
346:	learn: 0.2864919	total: 1.46s	remaining: 2.75s
347:	learn: 0.2863663	total: 1.46s	remaining: 2.74s
348:	learn: 0.2862502	total: 1.47s	remaining: 2.74s
349:	learn: 0.2861178	total: 1.47s	remaining: 2.73s
350:	learn: 0.2860038	total: 1.48s	remaining: 2.73s
351:	learn: 0.2858938	total: 1.48s	remaining: 2.73s
352:	learn: 0.2857592	total: 1.49s	remaining: 2.72s
353:	learn: 0.2856331	total: 1.49s	remaining: 2.72s
354:	learn: 0.2855587	total: 1.49s	remaining: 2.71s
355:	learn: 0.2854815	total: 1.5s	remaining: 2.71s
356:	learn: 0.2853715	total: 1.5s	remaining: 2.71s
357:	learn: 0.2852614	total: 1.51s	remaining: 2.7s
358:	learn: 0.2851750	total: 1.51s	remaining: 2.7s
359:	learn: 0.2850935	total: 1.52s	remaining: 2.7s
360:	learn: 0.2850134	total: 1.52s	remaining: 2.69s
361:	learn: 0.2848905	total: 1.53s	remaining: 2.69s
362:	learn: 0.2848001	total: 1.53s	remaining: 2.69s
363:	learn: 0.2846846	total: 1.53s	remaining: 2.68s
364:	learn: 0.2845744	total: 1.54s	remaining: 2.68s
365:	learn: 0.2844380	total: 1.54s	remaining: 2.67s
366:	learn: 0.2843897	total: 1.55s	remaining: 2.67s
367:	learn: 0.2842791	total: 1.55s	remaining: 2.67s
368:	learn: 0.2841693	total: 1.56s	remaining: 2.66s
369:	learn: 0.2840593	total: 1.56s	remaining: 2.66s
370:	learn: 0.2839234	total: 1.56s	remaining: 2.65s
371:	learn: 0.2838101	total: 1.57s	remaining: 2.65s
372:	learn: 0.2837153	total: 1.57s	remaining: 2.65s
373:	learn: 0.2836188	total: 1.58s	remaining: 2.64s
374:	learn: 0.2834868	total: 1.58s	remaining: 2.64s
375:	learn: 0.2833811	total: 1.59s	remaining: 2.63s
376:	learn: 0.2832862	total: 1.59s	remaining: 2.63s
377:	learn: 0.2832208	total: 1.59s	remaining: 2.63s
378:	learn: 0.2831053	total: 1.6s	remaining: 2.62s
379:	learn: 0.2829980	total: 1.6s	remaining: 2.62s
380:	learn: 0.2829005	total: 1.61s	remaining: 2.61s
381:	learn: 0.2827874	total: 1.61s	remaining: 2.61s
382:	learn: 0.2826472	total: 1.61s	remaining: 2.6s
383:	learn: 0.2825027	total: 1.62s	remaining: 2.6s
384:	learn: 0.2823803	total: 1.62s	remaining: 2.59s
385:	learn: 0.2822710	total: 1.63s	remaining: 2.59s
386:	learn: 0.2822019	total: 1.63s	remaining: 2.58s
387:	learn: 0.2820997	total: 1.64s	remaining: 2.58s
388:	learn: 0.2820169	total: 1.64s	remaining: 2.58s
389:	learn: 0.2819087	total: 1.65s	remaining: 2.58s
390:	learn: 0.2817979	total: 1.65s	remaining: 2.57s
391:	learn: 0.2817384	total: 1.66s	remaining: 2.57s
392:	learn: 0.2816596	total: 1.66s	remaining: 2.57s
393:	learn: 0.2815338	total: 1.67s	remaining: 2.56s
394:	learn: 0.2814273	total: 1.67s	remaining: 2.56s
395:	learn: 0.2813418	total: 1.68s	remaining: 2.56s
396:	learn: 0.2812625	total: 1.68s	remaining: 2.55s
397:	learn: 0.2811735	total: 1.68s	remaining: 2.55s
398:	learn: 0.2810564	total: 1.69s	remaining: 2.54s
399:	learn: 0.2809509	total: 1.69s	remaining: 2.54s
400:	learn: 0.2808620	total: 1.7s	remaining: 2.54s
401:	learn: 0.2807635	total: 1.7s	remaining: 2.53s
402:	learn: 0.2806477	total: 1.71s	remaining: 2.53s
403:	learn: 0.2805320	total: 1.71s	remaining: 2.52s
404:	learn: 0.2804177	total: 1.71s	remaining: 2.52s
405:	learn: 0.2802957	total: 1.72s	remaining: 2.51s
406:	learn: 0.2801839	total: 1.72s	remaining: 2.51s
407:	learn: 0.2801059	total: 1.73s	remaining: 2.5s
408:	learn: 0.2800048	total: 1.73s	remaining: 2.5s
409:	learn: 0.2799274	total: 1.73s	remaining: 2.5s
410:	learn: 0.2798409	total: 1.74s	remaining: 2.49s
411:	learn: 0.2797301	total: 1.74s	remaining: 2.49s
412:	learn: 0.2796402	total: 1.75s	remaining: 2.48s
413:	learn: 0.2795239	total: 1.75s	remaining: 2.48s
414:	learn: 0.2793983	total: 1.75s	remaining: 2.47s
415:	learn: 0.2793229	total: 1.76s	remaining: 2.47s
416:	learn: 0.2792432	total: 1.76s	remaining: 2.46s
417:	learn: 0.2791375	total: 1.76s	remaining: 2.46s
418:	learn: 0.2790363	total: 1.77s	remaining: 2.45s
419:	learn: 0.2789690	total: 1.77s	remaining: 2.44s
420:	learn: 0.2788777	total: 1.77s	remaining: 2.44s
421:	learn: 0.2787596	total: 1.78s	remaining: 2.43s
422:	learn: 0.2786770	total: 1.78s	remaining: 2.43s
423:	learn: 0.2785680	total: 1.78s	remaining: 2.42s
424:	learn: 0.2784753	total: 1.79s	remaining: 2.42s
425:	learn: 0.2783320	total: 1.79s	remaining: 2.42s
426:	learn: 0.2782330	total: 1.8s	remaining: 2.41s
427:	learn: 0.2781478	total: 1.8s	remaining: 2.4s
428:	learn: 0.2780343	total: 1.8s	remaining: 2.4s
429:	learn: 0.2779485	total: 1.81s	remaining: 2.4s
430:	learn: 0.2778442	total: 1.81s	remaining: 2.39s
431:	learn: 0.2777890	total: 1.81s	remaining: 2.38s
432:	learn: 0.2777008	total: 1.82s	remaining: 2.38s
433:	learn: 0.2775467	total: 1.82s	remaining: 2.38s
434:	learn: 0.2774213	total: 1.82s	remaining: 2.37s
435:	learn: 0.2773378	total: 1.83s	remaining: 2.37s
436:	learn: 0.2772484	total: 1.83s	remaining: 2.36s
437:	learn: 0.2771761	total: 1.84s	remaining: 2.36s
438:	learn: 0.2771248	total: 1.84s	remaining: 2.35s
439:	learn: 0.2770215	total: 1.85s	remaining: 2.35s
440:	learn: 0.2769047	total: 1.85s	remaining: 2.35s
441:	learn: 0.2768121	total: 1.85s	remaining: 2.34s
442:	learn: 0.2767450	total: 1.86s	remaining: 2.34s
443:	learn: 0.2766549	total: 1.86s	remaining: 2.33s
444:	learn: 0.2765374	total: 1.87s	remaining: 2.33s
445:	learn: 0.2763994	total: 1.87s	remaining: 2.33s
446:	learn: 0.2762980	total: 1.88s	remaining: 2.32s
447:	learn: 0.2762100	total: 1.88s	remaining: 2.31s
448:	learn: 0.2761515	total: 1.88s	remaining: 2.31s
449:	learn: 0.2760551	total: 1.89s	remaining: 2.31s
450:	learn: 0.2759509	total: 1.89s	remaining: 2.3s
451:	learn: 0.2758460	total: 1.9s	remaining: 2.3s
452:	learn: 0.2756759	total: 1.9s	remaining: 2.29s
453:	learn: 0.2755784	total: 1.9s	remaining: 2.29s
454:	learn: 0.2754660	total: 1.91s	remaining: 2.28s
455:	learn: 0.2753900	total: 1.91s	remaining: 2.28s
456:	learn: 0.2752808	total: 1.91s	remaining: 2.27s
457:	learn: 0.2752016	total: 1.92s	remaining: 2.27s
458:	learn: 0.2751003	total: 1.92s	remaining: 2.27s
459:	learn: 0.2750385	total: 1.93s	remaining: 2.26s
460:	learn: 0.2749475	total: 1.93s	remaining: 2.26s
461:	learn: 0.2748545	total: 1.93s	remaining: 2.25s
462:	learn: 0.2747430	total: 1.94s	remaining: 2.25s
463:	learn: 0.2746721	total: 1.94s	remaining: 2.25s
464:	learn: 0.2745669	total: 1.95s	remaining: 2.24s
465:	learn: 0.2744593	total: 1.95s	remaining: 2.24s
466:	learn: 0.2743825	total: 1.96s	remaining: 2.23s
467:	learn: 0.2743156	total: 1.96s	remaining: 2.23s
468:	learn: 0.2742476	total: 1.97s	remaining: 2.23s
469:	learn: 0.2741453	total: 1.97s	remaining: 2.22s
470:	learn: 0.2740472	total: 1.97s	remaining: 2.22s
471:	learn: 0.2739474	total: 1.98s	remaining: 2.21s
472:	learn: 0.2738604	total: 1.98s	remaining: 2.21s
473:	learn: 0.2737646	total: 1.99s	remaining: 2.21s
474:	learn: 0.2737060	total: 1.99s	remaining: 2.2s
475:	learn: 0.2736154	total: 2s	remaining: 2.2s
476:	learn: 0.2735564	total: 2s	remaining: 2.19s
477:	learn: 0.2734684	total: 2s	remaining: 2.19s
478:	learn: 0.2733494	total: 2.01s	remaining: 2.18s
479:	learn: 0.2732569	total: 2.01s	remaining: 2.18s
480:	learn: 0.2731369	total: 2.02s	remaining: 2.17s
481:	learn: 0.2730120	total: 2.02s	remaining: 2.17s
482:	learn: 0.2729124	total: 2.02s	remaining: 2.17s
483:	learn: 0.2728156	total: 2.03s	remaining: 2.16s
484:	learn: 0.2727144	total: 2.03s	remaining: 2.16s
485:	learn: 0.2726240	total: 2.04s	remaining: 2.15s
486:	learn: 0.2725374	total: 2.04s	remaining: 2.15s
487:	learn: 0.2724037	total: 2.04s	remaining: 2.15s
488:	learn: 0.2723346	total: 2.05s	remaining: 2.14s
489:	learn: 0.2722392	total: 2.05s	remaining: 2.14s
490:	learn: 0.2721466	total: 2.06s	remaining: 2.13s
491:	learn: 0.2720533	total: 2.06s	remaining: 2.13s
492:	learn: 0.2719802	total: 2.07s	remaining: 2.12s
493:	learn: 0.2718979	total: 2.07s	remaining: 2.12s
494:	learn: 0.2718457	total: 2.07s	remaining: 2.12s
495:	learn: 0.2717484	total: 2.08s	remaining: 2.11s
496:	learn: 0.2716901	total: 2.08s	remaining: 2.11s
497:	learn: 0.2715953	total: 2.08s	remaining: 2.1s
498:	learn: 0.2714938	total: 2.09s	remaining: 2.1s
499:	learn: 0.2714221	total: 2.09s	remaining: 2.09s
500:	learn: 0.2713292	total: 2.1s	remaining: 2.09s
501:	learn: 0.2712304	total: 2.1s	remaining: 2.08s
502:	learn: 0.2711099	total: 2.11s	remaining: 2.08s
503:	learn: 0.2710413	total: 2.11s	remaining: 2.08s
504:	learn: 0.2709466	total: 2.12s	remaining: 2.07s
505:	learn: 0.2708917	total: 2.12s	remaining: 2.07s
506:	learn: 0.2708009	total: 2.12s	remaining: 2.06s
507:	learn: 0.2707022	total: 2.13s	remaining: 2.06s
508:	learn: 0.2705792	total: 2.13s	remaining: 2.06s
509:	learn: 0.2704757	total: 2.14s	remaining: 2.05s
510:	learn: 0.2703824	total: 2.14s	remaining: 2.05s
511:	learn: 0.2703132	total: 2.14s	remaining: 2.04s
512:	learn: 0.2702455	total: 2.15s	remaining: 2.04s
513:	learn: 0.2701450	total: 2.15s	remaining: 2.04s
514:	learn: 0.2700363	total: 2.16s	remaining: 2.03s
515:	learn: 0.2699729	total: 2.16s	remaining: 2.03s
516:	learn: 0.2698658	total: 2.17s	remaining: 2.02s
517:	learn: 0.2697903	total: 2.17s	remaining: 2.02s
518:	learn: 0.2697124	total: 2.17s	remaining: 2.01s
519:	learn: 0.2696521	total: 2.18s	remaining: 2.01s
520:	learn: 0.2695773	total: 2.18s	remaining: 2.01s
521:	learn: 0.2694917	total: 2.19s	remaining: 2s
522:	learn: 0.2694061	total: 2.19s	remaining: 2s
523:	learn: 0.2692885	total: 2.19s	remaining: 1.99s
524:	learn: 0.2692192	total: 2.2s	remaining: 1.99s
525:	learn: 0.2691515	total: 2.2s	remaining: 1.98s
526:	learn: 0.2690332	total: 2.2s	remaining: 1.98s
527:	learn: 0.2689579	total: 2.21s	remaining: 1.97s
528:	learn: 0.2688622	total: 2.21s	remaining: 1.97s
529:	learn: 0.2687710	total: 2.22s	remaining: 1.97s
530:	learn: 0.2686977	total: 2.22s	remaining: 1.96s
531:	learn: 0.2686074	total: 2.22s	remaining: 1.96s
532:	learn: 0.2685171	total: 2.23s	remaining: 1.95s
533:	learn: 0.2683967	total: 2.23s	remaining: 1.95s
534:	learn: 0.2683026	total: 2.23s	remaining: 1.94s
535:	learn: 0.2682197	total: 2.24s	remaining: 1.94s
536:	learn: 0.2681266	total: 2.24s	remaining: 1.93s
537:	learn: 0.2680420	total: 2.25s	remaining: 1.93s
538:	learn: 0.2679685	total: 2.25s	remaining: 1.93s
539:	learn: 0.2678893	total: 2.25s	remaining: 1.92s
540:	learn: 0.2678032	total: 2.26s	remaining: 1.92s
541:	learn: 0.2676691	total: 2.26s	remaining: 1.91s
542:	learn: 0.2675687	total: 2.27s	remaining: 1.91s
543:	learn: 0.2674886	total: 2.27s	remaining: 1.9s
544:	learn: 0.2674296	total: 2.27s	remaining: 1.9s
545:	learn: 0.2673748	total: 2.28s	remaining: 1.89s
546:	learn: 0.2672750	total: 2.29s	remaining: 1.89s
547:	learn: 0.2671715	total: 2.29s	remaining: 1.89s
548:	learn: 0.2670817	total: 2.29s	remaining: 1.88s
549:	learn: 0.2670161	total: 2.3s	remaining: 1.88s
550:	learn: 0.2669392	total: 2.3s	remaining: 1.88s
551:	learn: 0.2668635	total: 2.31s	remaining: 1.87s
552:	learn: 0.2667563	total: 2.31s	remaining: 1.87s
553:	learn: 0.2666698	total: 2.31s	remaining: 1.86s
554:	learn: 0.2665804	total: 2.32s	remaining: 1.86s
555:	learn: 0.2664783	total: 2.32s	remaining: 1.85s
556:	learn: 0.2663820	total: 2.33s	remaining: 1.85s
557:	learn: 0.2662862	total: 2.33s	remaining: 1.85s
558:	learn: 0.2662093	total: 2.34s	remaining: 1.84s
559:	learn: 0.2661442	total: 2.34s	remaining: 1.84s
560:	learn: 0.2660665	total: 2.35s	remaining: 1.84s
561:	learn: 0.2660074	total: 2.35s	remaining: 1.83s
562:	learn: 0.2659399	total: 2.35s	remaining: 1.83s
563:	learn: 0.2658807	total: 2.36s	remaining: 1.82s
564:	learn: 0.2658032	total: 2.36s	remaining: 1.82s
565:	learn: 0.2657243	total: 2.37s	remaining: 1.82s
566:	learn: 0.2656665	total: 2.37s	remaining: 1.81s
567:	learn: 0.2655638	total: 2.38s	remaining: 1.81s
568:	learn: 0.2654717	total: 2.38s	remaining: 1.8s
569:	learn: 0.2653810	total: 2.38s	remaining: 1.8s
570:	learn: 0.2653075	total: 2.39s	remaining: 1.79s
571:	learn: 0.2652602	total: 2.39s	remaining: 1.79s
572:	learn: 0.2651691	total: 2.4s	remaining: 1.79s
573:	learn: 0.2650732	total: 2.4s	remaining: 1.78s
574:	learn: 0.2650051	total: 2.41s	remaining: 1.78s
575:	learn: 0.2649044	total: 2.41s	remaining: 1.77s
576:	learn: 0.2648321	total: 2.42s	remaining: 1.77s
577:	learn: 0.2647699	total: 2.42s	remaining: 1.77s
578:	learn: 0.2646678	total: 2.42s	remaining: 1.76s
579:	learn: 0.2645657	total: 2.43s	remaining: 1.76s
580:	learn: 0.2644747	total: 2.43s	remaining: 1.75s
581:	learn: 0.2643644	total: 2.44s	remaining: 1.75s
582:	learn: 0.2642932	total: 2.44s	remaining: 1.75s
583:	learn: 0.2641838	total: 2.44s	remaining: 1.74s
584:	learn: 0.2640861	total: 2.45s	remaining: 1.74s
585:	learn: 0.2640069	total: 2.46s	remaining: 1.73s
586:	learn: 0.2639413	total: 2.46s	remaining: 1.73s
587:	learn: 0.2638439	total: 2.46s	remaining: 1.73s
588:	learn: 0.2637865	total: 2.47s	remaining: 1.72s
589:	learn: 0.2636732	total: 2.47s	remaining: 1.72s
590:	learn: 0.2635862	total: 2.48s	remaining: 1.71s
591:	learn: 0.2635220	total: 2.48s	remaining: 1.71s
592:	learn: 0.2634297	total: 2.49s	remaining: 1.71s
593:	learn: 0.2633701	total: 2.49s	remaining: 1.7s
594:	learn: 0.2632990	total: 2.5s	remaining: 1.7s
595:	learn: 0.2632279	total: 2.5s	remaining: 1.69s
596:	learn: 0.2631418	total: 2.5s	remaining: 1.69s
597:	learn: 0.2630801	total: 2.51s	remaining: 1.69s
598:	learn: 0.2630023	total: 2.51s	remaining: 1.68s
599:	learn: 0.2628887	total: 2.52s	remaining: 1.68s
600:	learn: 0.2627671	total: 2.52s	remaining: 1.67s
601:	learn: 0.2626661	total: 2.52s	remaining: 1.67s
602:	learn: 0.2626010	total: 2.53s	remaining: 1.67s
603:	learn: 0.2625417	total: 2.53s	remaining: 1.66s
604:	learn: 0.2624933	total: 2.54s	remaining: 1.66s
605:	learn: 0.2624116	total: 2.54s	remaining: 1.65s
606:	learn: 0.2623078	total: 2.55s	remaining: 1.65s
607:	learn: 0.2622045	total: 2.55s	remaining: 1.65s
608:	learn: 0.2621497	total: 2.56s	remaining: 1.64s
609:	learn: 0.2620640	total: 2.56s	remaining: 1.64s
610:	learn: 0.2619887	total: 2.56s	remaining: 1.63s
611:	learn: 0.2619319	total: 2.57s	remaining: 1.63s
612:	learn: 0.2618741	total: 2.57s	remaining: 1.62s
613:	learn: 0.2617641	total: 2.58s	remaining: 1.62s
614:	learn: 0.2616905	total: 2.58s	remaining: 1.62s
615:	learn: 0.2615864	total: 2.59s	remaining: 1.61s
616:	learn: 0.2615026	total: 2.59s	remaining: 1.61s
617:	learn: 0.2614229	total: 2.6s	remaining: 1.6s
618:	learn: 0.2613595	total: 2.6s	remaining: 1.6s
619:	learn: 0.2612752	total: 2.6s	remaining: 1.6s
620:	learn: 0.2612022	total: 2.61s	remaining: 1.59s
621:	learn: 0.2610886	total: 2.61s	remaining: 1.59s
622:	learn: 0.2609855	total: 2.62s	remaining: 1.58s
623:	learn: 0.2609272	total: 2.62s	remaining: 1.58s
624:	learn: 0.2608400	total: 2.63s	remaining: 1.57s
625:	learn: 0.2607729	total: 2.63s	remaining: 1.57s
626:	learn: 0.2606965	total: 2.63s	remaining: 1.57s
627:	learn: 0.2606248	total: 2.64s	remaining: 1.56s
628:	learn: 0.2605584	total: 2.64s	remaining: 1.56s
629:	learn: 0.2604487	total: 2.64s	remaining: 1.55s
630:	learn: 0.2604032	total: 2.65s	remaining: 1.55s
631:	learn: 0.2602800	total: 2.65s	remaining: 1.54s
632:	learn: 0.2601716	total: 2.66s	remaining: 1.54s
633:	learn: 0.2601138	total: 2.66s	remaining: 1.53s
634:	learn: 0.2600417	total: 2.67s	remaining: 1.53s
635:	learn: 0.2599796	total: 2.67s	remaining: 1.53s
636:	learn: 0.2598948	total: 2.67s	remaining: 1.52s
637:	learn: 0.2598364	total: 2.68s	remaining: 1.52s
638:	learn: 0.2597313	total: 2.68s	remaining: 1.51s
639:	learn: 0.2596506	total: 2.69s	remaining: 1.51s
640:	learn: 0.2595673	total: 2.69s	remaining: 1.51s
641:	learn: 0.2594797	total: 2.7s	remaining: 1.5s
642:	learn: 0.2593988	total: 2.7s	remaining: 1.5s
643:	learn: 0.2593381	total: 2.71s	remaining: 1.5s
644:	learn: 0.2592649	total: 2.71s	remaining: 1.49s
645:	learn: 0.2591955	total: 2.71s	remaining: 1.49s
646:	learn: 0.2590969	total: 2.72s	remaining: 1.48s
647:	learn: 0.2589766	total: 2.72s	remaining: 1.48s
648:	learn: 0.2589018	total: 2.73s	remaining: 1.48s
649:	learn: 0.2588305	total: 2.73s	remaining: 1.47s
650:	learn: 0.2587540	total: 2.74s	remaining: 1.47s
651:	learn: 0.2586709	total: 2.74s	remaining: 1.46s
652:	learn: 0.2585745	total: 2.75s	remaining: 1.46s
653:	learn: 0.2584660	total: 2.75s	remaining: 1.45s
654:	learn: 0.2583782	total: 2.75s	remaining: 1.45s
655:	learn: 0.2583197	total: 2.76s	remaining: 1.45s
656:	learn: 0.2582597	total: 2.76s	remaining: 1.44s
657:	learn: 0.2581569	total: 2.77s	remaining: 1.44s
658:	learn: 0.2580980	total: 2.77s	remaining: 1.43s
659:	learn: 0.2580491	total: 2.77s	remaining: 1.43s
660:	learn: 0.2579988	total: 2.78s	remaining: 1.42s
661:	learn: 0.2579522	total: 2.78s	remaining: 1.42s
662:	learn: 0.2578897	total: 2.78s	remaining: 1.42s
663:	learn: 0.2577980	total: 2.79s	remaining: 1.41s
664:	learn: 0.2577218	total: 2.79s	remaining: 1.41s
665:	learn: 0.2576662	total: 2.79s	remaining: 1.4s
666:	learn: 0.2575884	total: 2.8s	remaining: 1.4s
667:	learn: 0.2575046	total: 2.8s	remaining: 1.39s
668:	learn: 0.2574173	total: 2.81s	remaining: 1.39s
669:	learn: 0.2573418	total: 2.81s	remaining: 1.39s
670:	learn: 0.2572641	total: 2.82s	remaining: 1.38s
671:	learn: 0.2572083	total: 2.82s	remaining: 1.38s
672:	learn: 0.2571437	total: 2.82s	remaining: 1.37s
673:	learn: 0.2570957	total: 2.83s	remaining: 1.37s
674:	learn: 0.2570302	total: 2.83s	remaining: 1.36s
675:	learn: 0.2569393	total: 2.83s	remaining: 1.36s
676:	learn: 0.2568790	total: 2.84s	remaining: 1.35s
677:	learn: 0.2567990	total: 2.84s	remaining: 1.35s
678:	learn: 0.2567353	total: 2.85s	remaining: 1.35s
679:	learn: 0.2566528	total: 2.85s	remaining: 1.34s
680:	learn: 0.2565362	total: 2.86s	remaining: 1.34s
681:	learn: 0.2565348	total: 2.86s	remaining: 1.33s
682:	learn: 0.2564520	total: 2.87s	remaining: 1.33s
683:	learn: 0.2563562	total: 2.87s	remaining: 1.33s
684:	learn: 0.2562962	total: 2.87s	remaining: 1.32s
685:	learn: 0.2562275	total: 2.88s	remaining: 1.32s
686:	learn: 0.2561540	total: 2.88s	remaining: 1.31s
687:	learn: 0.2560764	total: 2.88s	remaining: 1.31s
688:	learn: 0.2560127	total: 2.89s	remaining: 1.3s
689:	learn: 0.2559403	total: 2.89s	remaining: 1.3s
690:	learn: 0.2558942	total: 2.9s	remaining: 1.29s
691:	learn: 0.2557928	total: 2.9s	remaining: 1.29s
692:	learn: 0.2557268	total: 2.91s	remaining: 1.29s
693:	learn: 0.2556337	total: 2.91s	remaining: 1.28s
694:	learn: 0.2555364	total: 2.92s	remaining: 1.28s
695:	learn: 0.2554664	total: 2.92s	remaining: 1.27s
696:	learn: 0.2554069	total: 2.92s	remaining: 1.27s
697:	learn: 0.2553308	total: 2.93s	remaining: 1.27s
698:	learn: 0.2552399	total: 2.93s	remaining: 1.26s
699:	learn: 0.2551740	total: 2.94s	remaining: 1.26s
700:	learn: 0.2551247	total: 2.94s	remaining: 1.25s
701:	learn: 0.2550436	total: 2.94s	remaining: 1.25s
702:	learn: 0.2549576	total: 2.95s	remaining: 1.25s
703:	learn: 0.2548809	total: 2.95s	remaining: 1.24s
704:	learn: 0.2547837	total: 2.96s	remaining: 1.24s
705:	learn: 0.2546867	total: 2.96s	remaining: 1.23s
706:	learn: 0.2546453	total: 2.96s	remaining: 1.23s
707:	learn: 0.2545959	total: 2.97s	remaining: 1.22s
708:	learn: 0.2545109	total: 2.97s	remaining: 1.22s
709:	learn: 0.2544409	total: 2.98s	remaining: 1.22s
710:	learn: 0.2543732	total: 2.98s	remaining: 1.21s
711:	learn: 0.2543044	total: 2.98s	remaining: 1.21s
712:	learn: 0.2542331	total: 2.99s	remaining: 1.2s
713:	learn: 0.2541587	total: 3s	remaining: 1.2s
714:	learn: 0.2540463	total: 3s	remaining: 1.2s
715:	learn: 0.2539763	total: 3s	remaining: 1.19s
716:	learn: 0.2538825	total: 3.01s	remaining: 1.19s
717:	learn: 0.2537956	total: 3.01s	remaining: 1.18s
718:	learn: 0.2537093	total: 3.02s	remaining: 1.18s
719:	learn: 0.2536121	total: 3.02s	remaining: 1.18s
720:	learn: 0.2535295	total: 3.03s	remaining: 1.17s
721:	learn: 0.2534413	total: 3.03s	remaining: 1.17s
722:	learn: 0.2533583	total: 3.03s	remaining: 1.16s
723:	learn: 0.2532685	total: 3.04s	remaining: 1.16s
724:	learn: 0.2531496	total: 3.04s	remaining: 1.15s
725:	learn: 0.2530524	total: 3.05s	remaining: 1.15s
726:	learn: 0.2529533	total: 3.05s	remaining: 1.15s
727:	learn: 0.2528737	total: 3.06s	remaining: 1.14s
728:	learn: 0.2528302	total: 3.06s	remaining: 1.14s
729:	learn: 0.2527591	total: 3.07s	remaining: 1.13s
730:	learn: 0.2526825	total: 3.07s	remaining: 1.13s
731:	learn: 0.2525965	total: 3.08s	remaining: 1.13s
732:	learn: 0.2525305	total: 3.08s	remaining: 1.12s
733:	learn: 0.2524657	total: 3.08s	remaining: 1.12s
734:	learn: 0.2523837	total: 3.09s	remaining: 1.11s
735:	learn: 0.2523183	total: 3.09s	remaining: 1.11s
736:	learn: 0.2522502	total: 3.1s	remaining: 1.1s
737:	learn: 0.2521753	total: 3.1s	remaining: 1.1s
738:	learn: 0.2520944	total: 3.1s	remaining: 1.1s
739:	learn: 0.2520070	total: 3.11s	remaining: 1.09s
740:	learn: 0.2519511	total: 3.11s	remaining: 1.09s
741:	learn: 0.2518490	total: 3.12s	remaining: 1.08s
742:	learn: 0.2517718	total: 3.12s	remaining: 1.08s
743:	learn: 0.2517062	total: 3.13s	remaining: 1.08s
744:	learn: 0.2516486	total: 3.13s	remaining: 1.07s
745:	learn: 0.2515493	total: 3.14s	remaining: 1.07s
746:	learn: 0.2514559	total: 3.14s	remaining: 1.06s
747:	learn: 0.2513895	total: 3.15s	remaining: 1.06s
748:	learn: 0.2513149	total: 3.15s	remaining: 1.05s
749:	learn: 0.2513136	total: 3.15s	remaining: 1.05s
750:	learn: 0.2512684	total: 3.16s	remaining: 1.05s
751:	learn: 0.2511798	total: 3.16s	remaining: 1.04s
752:	learn: 0.2511234	total: 3.17s	remaining: 1.04s
753:	learn: 0.2510476	total: 3.17s	remaining: 1.03s
754:	learn: 0.2509854	total: 3.17s	remaining: 1.03s
755:	learn: 0.2509183	total: 3.18s	remaining: 1.03s
756:	learn: 0.2508546	total: 3.18s	remaining: 1.02s
757:	learn: 0.2507934	total: 3.19s	remaining: 1.02s
758:	learn: 0.2506985	total: 3.19s	remaining: 1.01s
759:	learn: 0.2506252	total: 3.2s	remaining: 1.01s
760:	learn: 0.2505662	total: 3.2s	remaining: 1s
761:	learn: 0.2505377	total: 3.2s	remaining: 1s
762:	learn: 0.2504412	total: 3.21s	remaining: 996ms
763:	learn: 0.2503752	total: 3.21s	remaining: 992ms
764:	learn: 0.2502922	total: 3.21s	remaining: 988ms
765:	learn: 0.2501896	total: 3.22s	remaining: 984ms
766:	learn: 0.2501321	total: 3.22s	remaining: 979ms
767:	learn: 0.2500576	total: 3.23s	remaining: 975ms
768:	learn: 0.2499812	total: 3.23s	remaining: 971ms
769:	learn: 0.2499045	total: 3.23s	remaining: 967ms
770:	learn: 0.2498125	total: 3.24s	remaining: 962ms
771:	learn: 0.2497465	total: 3.24s	remaining: 958ms
772:	learn: 0.2497165	total: 3.25s	remaining: 954ms
773:	learn: 0.2496206	total: 3.25s	remaining: 950ms
774:	learn: 0.2495427	total: 3.26s	remaining: 946ms
775:	learn: 0.2494705	total: 3.26s	remaining: 941ms
776:	learn: 0.2494153	total: 3.26s	remaining: 937ms
777:	learn: 0.2493142	total: 3.27s	remaining: 933ms
778:	learn: 0.2492542	total: 3.27s	remaining: 928ms
779:	learn: 0.2491685	total: 3.28s	remaining: 924ms
780:	learn: 0.2491157	total: 3.28s	remaining: 920ms
781:	learn: 0.2490817	total: 3.29s	remaining: 916ms
782:	learn: 0.2490039	total: 3.29s	remaining: 912ms
783:	learn: 0.2489112	total: 3.29s	remaining: 908ms
784:	learn: 0.2488345	total: 3.3s	remaining: 904ms
785:	learn: 0.2487465	total: 3.3s	remaining: 899ms
786:	learn: 0.2486911	total: 3.31s	remaining: 895ms
787:	learn: 0.2486511	total: 3.31s	remaining: 891ms
788:	learn: 0.2485685	total: 3.32s	remaining: 887ms
789:	learn: 0.2484846	total: 3.32s	remaining: 883ms
790:	learn: 0.2484114	total: 3.33s	remaining: 879ms
791:	learn: 0.2483691	total: 3.33s	remaining: 874ms
792:	learn: 0.2482811	total: 3.33s	remaining: 870ms
793:	learn: 0.2482117	total: 3.34s	remaining: 866ms
794:	learn: 0.2481320	total: 3.34s	remaining: 862ms
795:	learn: 0.2480752	total: 3.35s	remaining: 857ms
796:	learn: 0.2480160	total: 3.35s	remaining: 853ms
797:	learn: 0.2479381	total: 3.35s	remaining: 849ms
798:	learn: 0.2478744	total: 3.36s	remaining: 845ms
799:	learn: 0.2477995	total: 3.36s	remaining: 841ms
800:	learn: 0.2477533	total: 3.37s	remaining: 836ms
801:	learn: 0.2476784	total: 3.37s	remaining: 832ms
802:	learn: 0.2475958	total: 3.38s	remaining: 828ms
803:	learn: 0.2475071	total: 3.38s	remaining: 824ms
804:	learn: 0.2474310	total: 3.38s	remaining: 820ms
805:	learn: 0.2473523	total: 3.39s	remaining: 816ms
806:	learn: 0.2472585	total: 3.39s	remaining: 811ms
807:	learn: 0.2471860	total: 3.4s	remaining: 807ms
808:	learn: 0.2470923	total: 3.4s	remaining: 803ms
809:	learn: 0.2470479	total: 3.4s	remaining: 799ms
810:	learn: 0.2469871	total: 3.41s	remaining: 794ms
811:	learn: 0.2469111	total: 3.41s	remaining: 790ms
812:	learn: 0.2468368	total: 3.42s	remaining: 786ms
813:	learn: 0.2467490	total: 3.42s	remaining: 782ms
814:	learn: 0.2466871	total: 3.43s	remaining: 778ms
815:	learn: 0.2466368	total: 3.43s	remaining: 774ms
816:	learn: 0.2465712	total: 3.43s	remaining: 769ms
817:	learn: 0.2465702	total: 3.44s	remaining: 765ms
818:	learn: 0.2465081	total: 3.44s	remaining: 761ms
819:	learn: 0.2464299	total: 3.45s	remaining: 757ms
820:	learn: 0.2463558	total: 3.45s	remaining: 752ms
821:	learn: 0.2462679	total: 3.46s	remaining: 748ms
822:	learn: 0.2461820	total: 3.46s	remaining: 744ms
823:	learn: 0.2461227	total: 3.46s	remaining: 740ms
824:	learn: 0.2460766	total: 3.47s	remaining: 736ms
825:	learn: 0.2459978	total: 3.47s	remaining: 732ms
826:	learn: 0.2459008	total: 3.48s	remaining: 728ms
827:	learn: 0.2458598	total: 3.48s	remaining: 723ms
828:	learn: 0.2458032	total: 3.49s	remaining: 719ms
829:	learn: 0.2457474	total: 3.49s	remaining: 715ms
830:	learn: 0.2456800	total: 3.49s	remaining: 711ms
831:	learn: 0.2456211	total: 3.5s	remaining: 706ms
832:	learn: 0.2455481	total: 3.5s	remaining: 702ms
833:	learn: 0.2454827	total: 3.51s	remaining: 698ms
834:	learn: 0.2454159	total: 3.51s	remaining: 694ms
835:	learn: 0.2453318	total: 3.52s	remaining: 690ms
836:	learn: 0.2452714	total: 3.52s	remaining: 685ms
837:	learn: 0.2452522	total: 3.52s	remaining: 681ms
838:	learn: 0.2451975	total: 3.53s	remaining: 677ms
839:	learn: 0.2451282	total: 3.53s	remaining: 673ms
840:	learn: 0.2450544	total: 3.54s	remaining: 669ms
841:	learn: 0.2449850	total: 3.54s	remaining: 664ms
842:	learn: 0.2449140	total: 3.54s	remaining: 660ms
843:	learn: 0.2448555	total: 3.55s	remaining: 656ms
844:	learn: 0.2447860	total: 3.55s	remaining: 652ms
845:	learn: 0.2446988	total: 3.56s	remaining: 648ms
846:	learn: 0.2446106	total: 3.56s	remaining: 643ms
847:	learn: 0.2445743	total: 3.57s	remaining: 639ms
848:	learn: 0.2445013	total: 3.57s	remaining: 635ms
849:	learn: 0.2444238	total: 3.57s	remaining: 631ms
850:	learn: 0.2443663	total: 3.58s	remaining: 627ms
851:	learn: 0.2442723	total: 3.58s	remaining: 622ms
852:	learn: 0.2442002	total: 3.59s	remaining: 618ms
853:	learn: 0.2441599	total: 3.59s	remaining: 614ms
854:	learn: 0.2440745	total: 3.59s	remaining: 610ms
855:	learn: 0.2439977	total: 3.6s	remaining: 606ms
856:	learn: 0.2438879	total: 3.6s	remaining: 601ms
857:	learn: 0.2438133	total: 3.61s	remaining: 597ms
858:	learn: 0.2437315	total: 3.61s	remaining: 593ms
859:	learn: 0.2436847	total: 3.62s	remaining: 589ms
860:	learn: 0.2435868	total: 3.62s	remaining: 585ms
861:	learn: 0.2435216	total: 3.63s	remaining: 581ms
862:	learn: 0.2434680	total: 3.63s	remaining: 576ms
863:	learn: 0.2434088	total: 3.63s	remaining: 572ms
864:	learn: 0.2433094	total: 3.64s	remaining: 568ms
865:	learn: 0.2432401	total: 3.64s	remaining: 563ms
866:	learn: 0.2431588	total: 3.65s	remaining: 559ms
867:	learn: 0.2431245	total: 3.65s	remaining: 555ms
868:	learn: 0.2430456	total: 3.65s	remaining: 551ms
869:	learn: 0.2430049	total: 3.65s	remaining: 546ms
870:	learn: 0.2429557	total: 3.66s	remaining: 542ms
871:	learn: 0.2429039	total: 3.66s	remaining: 538ms
872:	learn: 0.2428469	total: 3.67s	remaining: 533ms
873:	learn: 0.2427800	total: 3.67s	remaining: 529ms
874:	learn: 0.2427072	total: 3.67s	remaining: 525ms
875:	learn: 0.2426470	total: 3.68s	remaining: 521ms
876:	learn: 0.2425661	total: 3.68s	remaining: 517ms
877:	learn: 0.2424761	total: 3.69s	remaining: 512ms
878:	learn: 0.2424047	total: 3.69s	remaining: 508ms
879:	learn: 0.2423640	total: 3.69s	remaining: 504ms
880:	learn: 0.2422972	total: 3.7s	remaining: 500ms
881:	learn: 0.2422423	total: 3.7s	remaining: 496ms
882:	learn: 0.2421725	total: 3.71s	remaining: 491ms
883:	learn: 0.2421203	total: 3.71s	remaining: 487ms
884:	learn: 0.2420577	total: 3.72s	remaining: 483ms
885:	learn: 0.2419866	total: 3.72s	remaining: 479ms
886:	learn: 0.2419418	total: 3.73s	remaining: 475ms
887:	learn: 0.2418731	total: 3.73s	remaining: 470ms
888:	learn: 0.2418245	total: 3.73s	remaining: 466ms
889:	learn: 0.2417391	total: 3.74s	remaining: 462ms
890:	learn: 0.2416716	total: 3.74s	remaining: 458ms
891:	learn: 0.2415698	total: 3.75s	remaining: 454ms
892:	learn: 0.2414988	total: 3.75s	remaining: 449ms
893:	learn: 0.2414565	total: 3.75s	remaining: 445ms
894:	learn: 0.2414031	total: 3.76s	remaining: 441ms
895:	learn: 0.2413305	total: 3.76s	remaining: 437ms
896:	learn: 0.2412301	total: 3.77s	remaining: 432ms
897:	learn: 0.2411568	total: 3.77s	remaining: 428ms
898:	learn: 0.2410888	total: 3.77s	remaining: 424ms
899:	learn: 0.2410249	total: 3.78s	remaining: 420ms
900:	learn: 0.2409456	total: 3.78s	remaining: 415ms
901:	learn: 0.2408959	total: 3.78s	remaining: 411ms
902:	learn: 0.2408368	total: 3.79s	remaining: 407ms
903:	learn: 0.2407655	total: 3.79s	remaining: 403ms
904:	learn: 0.2406809	total: 3.8s	remaining: 399ms
905:	learn: 0.2405995	total: 3.8s	remaining: 394ms
906:	learn: 0.2405219	total: 3.8s	remaining: 390ms
907:	learn: 0.2404525	total: 3.81s	remaining: 386ms
908:	learn: 0.2403714	total: 3.81s	remaining: 382ms
909:	learn: 0.2402894	total: 3.82s	remaining: 378ms
910:	learn: 0.2402492	total: 3.82s	remaining: 373ms
911:	learn: 0.2401718	total: 3.83s	remaining: 369ms
912:	learn: 0.2400965	total: 3.83s	remaining: 365ms
913:	learn: 0.2400241	total: 3.83s	remaining: 361ms
914:	learn: 0.2399634	total: 3.84s	remaining: 357ms
915:	learn: 0.2399018	total: 3.84s	remaining: 352ms
916:	learn: 0.2398600	total: 3.84s	remaining: 348ms
917:	learn: 0.2398106	total: 3.85s	remaining: 344ms
918:	learn: 0.2397701	total: 3.85s	remaining: 340ms
919:	learn: 0.2397055	total: 3.86s	remaining: 335ms
920:	learn: 0.2396079	total: 3.86s	remaining: 331ms
921:	learn: 0.2395288	total: 3.87s	remaining: 327ms
922:	learn: 0.2394586	total: 3.87s	remaining: 323ms
923:	learn: 0.2393794	total: 3.88s	remaining: 319ms
924:	learn: 0.2393135	total: 3.88s	remaining: 315ms
925:	learn: 0.2392673	total: 3.88s	remaining: 310ms
926:	learn: 0.2392190	total: 3.89s	remaining: 306ms
927:	learn: 0.2391386	total: 3.89s	remaining: 302ms
928:	learn: 0.2390718	total: 3.9s	remaining: 298ms
929:	learn: 0.2390305	total: 3.9s	remaining: 294ms
930:	learn: 0.2389384	total: 3.9s	remaining: 289ms
931:	learn: 0.2388646	total: 3.91s	remaining: 285ms
932:	learn: 0.2387824	total: 3.91s	remaining: 281ms
933:	learn: 0.2387134	total: 3.92s	remaining: 277ms
934:	learn: 0.2386481	total: 3.92s	remaining: 273ms
935:	learn: 0.2386223	total: 3.93s	remaining: 268ms
936:	learn: 0.2385524	total: 3.93s	remaining: 264ms
937:	learn: 0.2384586	total: 3.93s	remaining: 260ms
938:	learn: 0.2384056	total: 3.94s	remaining: 256ms
939:	learn: 0.2383583	total: 3.94s	remaining: 252ms
940:	learn: 0.2383157	total: 3.94s	remaining: 247ms
941:	learn: 0.2382465	total: 3.95s	remaining: 243ms
942:	learn: 0.2381637	total: 3.95s	remaining: 239ms
943:	learn: 0.2380875	total: 3.96s	remaining: 235ms
944:	learn: 0.2380253	total: 3.96s	remaining: 231ms
945:	learn: 0.2379596	total: 3.97s	remaining: 226ms
946:	learn: 0.2379280	total: 3.97s	remaining: 222ms
947:	learn: 0.2378648	total: 3.97s	remaining: 218ms
948:	learn: 0.2377774	total: 3.98s	remaining: 214ms
949:	learn: 0.2376773	total: 3.98s	remaining: 210ms
950:	learn: 0.2376082	total: 3.99s	remaining: 205ms
951:	learn: 0.2375173	total: 3.99s	remaining: 201ms
952:	learn: 0.2374867	total: 3.99s	remaining: 197ms
953:	learn: 0.2374170	total: 4s	remaining: 193ms
954:	learn: 0.2373635	total: 4s	remaining: 189ms
955:	learn: 0.2373307	total: 4s	remaining: 184ms
956:	learn: 0.2372607	total: 4.01s	remaining: 180ms
957:	learn: 0.2372029	total: 4.01s	remaining: 176ms
958:	learn: 0.2371279	total: 4.02s	remaining: 172ms
959:	learn: 0.2370564	total: 4.02s	remaining: 168ms
960:	learn: 0.2369848	total: 4.03s	remaining: 163ms
961:	learn: 0.2369173	total: 4.03s	remaining: 159ms
962:	learn: 0.2368479	total: 4.03s	remaining: 155ms
963:	learn: 0.2367762	total: 4.04s	remaining: 151ms
964:	learn: 0.2367142	total: 4.04s	remaining: 147ms
965:	learn: 0.2366359	total: 4.04s	remaining: 142ms
966:	learn: 0.2365622	total: 4.05s	remaining: 138ms
967:	learn: 0.2364739	total: 4.05s	remaining: 134ms
968:	learn: 0.2364027	total: 4.06s	remaining: 130ms
969:	learn: 0.2363486	total: 4.06s	remaining: 126ms
970:	learn: 0.2362723	total: 4.06s	remaining: 121ms
971:	learn: 0.2362028	total: 4.07s	remaining: 117ms
972:	learn: 0.2361584	total: 4.07s	remaining: 113ms
973:	learn: 0.2360857	total: 4.08s	remaining: 109ms
974:	learn: 0.2360364	total: 4.08s	remaining: 105ms
975:	learn: 0.2359788	total: 4.08s	remaining: 100ms
976:	learn: 0.2359136	total: 4.09s	remaining: 96.2ms
977:	learn: 0.2358763	total: 4.09s	remaining: 92ms
978:	learn: 0.2358171	total: 4.1s	remaining: 87.9ms
979:	learn: 0.2357520	total: 4.1s	remaining: 83.7ms
980:	learn: 0.2356679	total: 4.1s	remaining: 79.5ms
981:	learn: 0.2356051	total: 4.11s	remaining: 75.3ms
982:	learn: 0.2355623	total: 4.11s	remaining: 71.1ms
983:	learn: 0.2355187	total: 4.12s	remaining: 66.9ms
984:	learn: 0.2354710	total: 4.12s	remaining: 62.7ms
985:	learn: 0.2353936	total: 4.12s	remaining: 58.5ms
986:	learn: 0.2353532	total: 4.13s	remaining: 54.4ms
987:	learn: 0.2352654	total: 4.13s	remaining: 50.2ms
988:	learn: 0.2351708	total: 4.14s	remaining: 46ms
989:	learn: 0.2351344	total: 4.14s	remaining: 41.8ms
990:	learn: 0.2350958	total: 4.14s	remaining: 37.6ms
991:	learn: 0.2350621	total: 4.15s	remaining: 33.4ms
992:	learn: 0.2350127	total: 4.15s	remaining: 29.3ms
993:	learn: 0.2349372	total: 4.15s	remaining: 25.1ms
994:	learn: 0.2348698	total: 4.16s	remaining: 20.9ms
995:	learn: 0.2348187	total: 4.16s	remaining: 16.7ms
996:	learn: 0.2347460	total: 4.17s	remaining: 12.5ms
997:	learn: 0.2347066	total: 4.17s	remaining: 8.36ms
998:	learn: 0.2346789	total: 4.17s	remaining: 4.18ms
999:	learn: 0.2346649	total: 4.18s	remaining: 0us
In [278]:
cat_finalscore
Out[278]:
0.8659069437883797
In [279]:
r={"Catboost":86.5906943}
Results.append(r)
In [280]:
cm = confusion_matrix(y_test, y_pred)
disp = ConfusionMatrixDisplay(confusion_matrix=cm)
disp.plot();
In [281]:
print(classification_report(y_test,y_pred))
              precision    recall  f1-score   support

           0       0.89      0.95      0.92     13173
           1       0.76      0.57      0.65      3763

    accuracy                           0.87     16936
   macro avg       0.83      0.76      0.79     16936
weighted avg       0.86      0.87      0.86     16936

In [282]:
Original_Data.info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 145460 entries, 0 to 145459
Data columns (total 25 columns):
 #   Column         Non-Null Count   Dtype         
---  ------         --------------   -----         
 0   Date           145460 non-null  datetime64[ns]
 1   Location       145460 non-null  object        
 2   MinTemp        143975 non-null  float64       
 3   MaxTemp        144199 non-null  float64       
 4   Rainfall       142199 non-null  float64       
 5   Evaporation    82670 non-null   float64       
 6   Sunshine       75625 non-null   float64       
 7   WindGustDir    135134 non-null  object        
 8   WindGustSpeed  135197 non-null  float64       
 9   WindDir9am     134894 non-null  object        
 10  WindDir3pm     141232 non-null  object        
 11  WindSpeed9am   143693 non-null  float64       
 12  WindSpeed3pm   142398 non-null  float64       
 13  Humidity9am    142806 non-null  float64       
 14  Humidity3pm    140953 non-null  float64       
 15  Pressure9am    130395 non-null  float64       
 16  Pressure3pm    130432 non-null  float64       
 17  Cloud9am       89572 non-null   float64       
 18  Cloud3pm       86102 non-null   float64       
 19  Temp9am        143693 non-null  float64       
 20  Temp3pm        141851 non-null  float64       
 21  RainToday      142199 non-null  object        
 22  RainTomorrow   142193 non-null  object        
 23  Year           145460 non-null  int64         
 24  Month          145460 non-null  int64         
dtypes: datetime64[ns](1), float64(16), int64(2), object(6)
memory usage: 27.7+ MB
In [283]:
cols = Original_Data .columns.tolist()
cols =  cols[:-2]
Original_Data  =Original_Data [cols]
Original_Data .info()
<class 'pandas.core.frame.DataFrame'>
RangeIndex: 145460 entries, 0 to 145459
Data columns (total 23 columns):
 #   Column         Non-Null Count   Dtype         
---  ------         --------------   -----         
 0   Date           145460 non-null  datetime64[ns]
 1   Location       145460 non-null  object        
 2   MinTemp        143975 non-null  float64       
 3   MaxTemp        144199 non-null  float64       
 4   Rainfall       142199 non-null  float64       
 5   Evaporation    82670 non-null   float64       
 6   Sunshine       75625 non-null   float64       
 7   WindGustDir    135134 non-null  object        
 8   WindGustSpeed  135197 non-null  float64       
 9   WindDir9am     134894 non-null  object        
 10  WindDir3pm     141232 non-null  object        
 11  WindSpeed9am   143693 non-null  float64       
 12  WindSpeed3pm   142398 non-null  float64       
 13  Humidity9am    142806 non-null  float64       
 14  Humidity3pm    140953 non-null  float64       
 15  Pressure9am    130395 non-null  float64       
 16  Pressure3pm    130432 non-null  float64       
 17  Cloud9am       89572 non-null   float64       
 18  Cloud3pm       86102 non-null   float64       
 19  Temp9am        143693 non-null  float64       
 20  Temp3pm        141851 non-null  float64       
 21  RainToday      142199 non-null  object        
 22  RainTomorrow   142193 non-null  object        
dtypes: datetime64[ns](1), float64(16), object(6)
memory usage: 25.5+ MB
In [284]:
Original_Data.head()
le = LabelEncoder()
Original_Data [cat_cols] =Original_Data[cat_cols].astype('str').apply(le.fit_transform)
In [285]:
X = Original_Data.iloc[:, :-1]
y =Original_Data.iloc[:, -1:]

X_train, X_test, y_train, y_test = train_test_split(X, y,
                                                   test_size=0.3, random_state = 1)
In [286]:
X_train.head()
Out[286]:
Date Location MinTemp MaxTemp Rainfall Evaporation Sunshine WindGustDir WindGustSpeed WindDir9am ... WindSpeed3pm Humidity9am Humidity3pm Pressure9am Pressure3pm Cloud9am Cloud3pm Temp9am Temp3pm RainToday
51128 2502 40 14.0 27.8 10.4 NaN NaN 1 35.0 3 ... 20.0 74.0 43.0 1007.8 1004.1 NaN NaN 19.7 27.1 1
51182 2556 40 13.7 23.8 0.0 NaN NaN 10 41.0 10 ... 19.0 65.0 35.0 1013.7 1011.4 NaN NaN 16.7 21.7 0
12937 1297 21 2.6 17.7 0.0 2.2 9.9 14 22.0 4 ... 9.0 77.0 44.0 1020.5 1017.0 1.0 1.0 9.5 17.2 0
103061 935 28 9.3 16.8 3.4 1.6 4.7 4 30.0 1 ... 13.0 93.0 83.0 1007.1 1004.0 7.0 7.0 11.6 15.2 1
25342 1611 30 9.6 18.5 0.2 NaN NaN 10 37.0 11 ... 15.0 82.0 62.0 NaN NaN NaN NaN 15.4 17.5 0

5 rows × 22 columns

In [287]:
#Catboost
from catboost import CatBoostClassifier, Pool
cat = CatBoostClassifier()
cat.fit(X_train, y_train)
y_pred = cat.predict(X_test)
Learning rate set to 0.100167
0:	learn: 0.9871116	total: 13.6ms	remaining: 13.6s
1:	learn: 0.9012665	total: 23.2ms	remaining: 11.6s
2:	learn: 0.8329625	total: 32.1ms	remaining: 10.7s
3:	learn: 0.7766651	total: 41.5ms	remaining: 10.3s
4:	learn: 0.7306136	total: 51ms	remaining: 10.2s
5:	learn: 0.6912887	total: 59.9ms	remaining: 9.92s
6:	learn: 0.6582144	total: 68.7ms	remaining: 9.74s
7:	learn: 0.6292021	total: 77.8ms	remaining: 9.64s
8:	learn: 0.6049600	total: 86.8ms	remaining: 9.56s
9:	learn: 0.5830623	total: 95.7ms	remaining: 9.47s
10:	learn: 0.5640241	total: 105ms	remaining: 9.46s
11:	learn: 0.5471388	total: 115ms	remaining: 9.45s
12:	learn: 0.5326310	total: 124ms	remaining: 9.43s
13:	learn: 0.5198865	total: 134ms	remaining: 9.41s
14:	learn: 0.5088604	total: 144ms	remaining: 9.46s
15:	learn: 0.4992145	total: 154ms	remaining: 9.45s
16:	learn: 0.4903840	total: 163ms	remaining: 9.4s
17:	learn: 0.4824681	total: 172ms	remaining: 9.38s
18:	learn: 0.4755920	total: 181ms	remaining: 9.34s
19:	learn: 0.4697985	total: 192ms	remaining: 9.39s
20:	learn: 0.4642776	total: 202ms	remaining: 9.4s
21:	learn: 0.4589622	total: 210ms	remaining: 9.35s
22:	learn: 0.4548681	total: 219ms	remaining: 9.31s
23:	learn: 0.4503970	total: 229ms	remaining: 9.31s
24:	learn: 0.4468350	total: 238ms	remaining: 9.27s
25:	learn: 0.4431954	total: 248ms	remaining: 9.28s
26:	learn: 0.4394956	total: 258ms	remaining: 9.28s
27:	learn: 0.4367539	total: 266ms	remaining: 9.25s
28:	learn: 0.4341421	total: 275ms	remaining: 9.22s
29:	learn: 0.4316738	total: 285ms	remaining: 9.23s
30:	learn: 0.4295046	total: 295ms	remaining: 9.21s
31:	learn: 0.4269777	total: 303ms	remaining: 9.16s
32:	learn: 0.4253421	total: 312ms	remaining: 9.15s
33:	learn: 0.4233965	total: 321ms	remaining: 9.11s
34:	learn: 0.4217221	total: 330ms	remaining: 9.1s
35:	learn: 0.4203056	total: 340ms	remaining: 9.1s
36:	learn: 0.4189434	total: 350ms	remaining: 9.12s
37:	learn: 0.4177436	total: 359ms	remaining: 9.1s
38:	learn: 0.4163604	total: 368ms	remaining: 9.06s
39:	learn: 0.4152941	total: 377ms	remaining: 9.05s
40:	learn: 0.4144878	total: 385ms	remaining: 9.01s
41:	learn: 0.4138245	total: 393ms	remaining: 8.97s
42:	learn: 0.4126920	total: 402ms	remaining: 8.94s
43:	learn: 0.4116848	total: 411ms	remaining: 8.92s
44:	learn: 0.4106078	total: 420ms	remaining: 8.91s
45:	learn: 0.4098763	total: 429ms	remaining: 8.89s
46:	learn: 0.4089824	total: 438ms	remaining: 8.87s
47:	learn: 0.4083725	total: 446ms	remaining: 8.84s
48:	learn: 0.4075131	total: 455ms	remaining: 8.84s
49:	learn: 0.4063119	total: 464ms	remaining: 8.82s
50:	learn: 0.4055121	total: 474ms	remaining: 8.81s
51:	learn: 0.4047044	total: 483ms	remaining: 8.8s
52:	learn: 0.4040289	total: 493ms	remaining: 8.81s
53:	learn: 0.4034535	total: 502ms	remaining: 8.79s
54:	learn: 0.4026985	total: 511ms	remaining: 8.78s
55:	learn: 0.4021327	total: 520ms	remaining: 8.76s
56:	learn: 0.4015337	total: 529ms	remaining: 8.75s
57:	learn: 0.4006971	total: 538ms	remaining: 8.73s
58:	learn: 0.4002323	total: 548ms	remaining: 8.73s
59:	learn: 0.3997019	total: 557ms	remaining: 8.73s
60:	learn: 0.3992729	total: 567ms	remaining: 8.73s
61:	learn: 0.3987927	total: 579ms	remaining: 8.75s
62:	learn: 0.3982979	total: 591ms	remaining: 8.79s
63:	learn: 0.3968729	total: 602ms	remaining: 8.8s
64:	learn: 0.3965457	total: 613ms	remaining: 8.81s
65:	learn: 0.3961745	total: 624ms	remaining: 8.83s
66:	learn: 0.3958532	total: 634ms	remaining: 8.83s
67:	learn: 0.3955224	total: 643ms	remaining: 8.82s
68:	learn: 0.3952487	total: 652ms	remaining: 8.8s
69:	learn: 0.3948223	total: 662ms	remaining: 8.8s
70:	learn: 0.3945311	total: 672ms	remaining: 8.8s
71:	learn: 0.3941427	total: 682ms	remaining: 8.79s
72:	learn: 0.3939282	total: 691ms	remaining: 8.77s
73:	learn: 0.3934118	total: 701ms	remaining: 8.77s
74:	learn: 0.3930195	total: 709ms	remaining: 8.74s
75:	learn: 0.3927521	total: 718ms	remaining: 8.73s
76:	learn: 0.3923428	total: 729ms	remaining: 8.73s
77:	learn: 0.3920577	total: 738ms	remaining: 8.72s
78:	learn: 0.3917762	total: 747ms	remaining: 8.7s
79:	learn: 0.3914349	total: 757ms	remaining: 8.7s
80:	learn: 0.3912144	total: 765ms	remaining: 8.68s
81:	learn: 0.3908217	total: 776ms	remaining: 8.69s
82:	learn: 0.3905011	total: 785ms	remaining: 8.67s
83:	learn: 0.3901798	total: 794ms	remaining: 8.66s
84:	learn: 0.3898679	total: 803ms	remaining: 8.65s
85:	learn: 0.3893643	total: 814ms	remaining: 8.65s
86:	learn: 0.3890730	total: 824ms	remaining: 8.65s
87:	learn: 0.3887356	total: 835ms	remaining: 8.65s
88:	learn: 0.3883275	total: 844ms	remaining: 8.64s
89:	learn: 0.3881335	total: 853ms	remaining: 8.62s
90:	learn: 0.3877404	total: 861ms	remaining: 8.6s
91:	learn: 0.3875549	total: 870ms	remaining: 8.59s
92:	learn: 0.3872400	total: 880ms	remaining: 8.58s
93:	learn: 0.3870157	total: 890ms	remaining: 8.58s
94:	learn: 0.3866436	total: 900ms	remaining: 8.57s
95:	learn: 0.3864316	total: 909ms	remaining: 8.55s
96:	learn: 0.3861659	total: 919ms	remaining: 8.55s
97:	learn: 0.3858859	total: 928ms	remaining: 8.54s
98:	learn: 0.3856009	total: 938ms	remaining: 8.54s
99:	learn: 0.3852042	total: 948ms	remaining: 8.53s
100:	learn: 0.3848699	total: 958ms	remaining: 8.52s
101:	learn: 0.3845349	total: 968ms	remaining: 8.52s
102:	learn: 0.3842029	total: 977ms	remaining: 8.51s
103:	learn: 0.3839762	total: 987ms	remaining: 8.5s
104:	learn: 0.3837893	total: 996ms	remaining: 8.49s
105:	learn: 0.3835345	total: 1.01s	remaining: 8.49s
106:	learn: 0.3833142	total: 1.02s	remaining: 8.48s
107:	learn: 0.3829935	total: 1.03s	remaining: 8.48s
108:	learn: 0.3826412	total: 1.04s	remaining: 8.47s
109:	learn: 0.3824647	total: 1.04s	remaining: 8.46s
110:	learn: 0.3822848	total: 1.05s	remaining: 8.45s
111:	learn: 0.3821289	total: 1.06s	remaining: 8.43s
112:	learn: 0.3818292	total: 1.07s	remaining: 8.43s
113:	learn: 0.3815265	total: 1.08s	remaining: 8.44s
114:	learn: 0.3813547	total: 1.1s	remaining: 8.44s
115:	learn: 0.3810877	total: 1.11s	remaining: 8.45s
116:	learn: 0.3807205	total: 1.12s	remaining: 8.47s
117:	learn: 0.3804952	total: 1.13s	remaining: 8.47s
118:	learn: 0.3802944	total: 1.15s	remaining: 8.48s
119:	learn: 0.3800132	total: 1.16s	remaining: 8.49s
120:	learn: 0.3797943	total: 1.17s	remaining: 8.49s
121:	learn: 0.3795677	total: 1.18s	remaining: 8.48s
122:	learn: 0.3792664	total: 1.19s	remaining: 8.48s
123:	learn: 0.3791137	total: 1.2s	remaining: 8.48s
124:	learn: 0.3788939	total: 1.21s	remaining: 8.47s
125:	learn: 0.3786706	total: 1.22s	remaining: 8.47s
126:	learn: 0.3783814	total: 1.23s	remaining: 8.46s
127:	learn: 0.3781152	total: 1.24s	remaining: 8.45s
128:	learn: 0.3778537	total: 1.25s	remaining: 8.44s
129:	learn: 0.3776595	total: 1.26s	remaining: 8.44s
130:	learn: 0.3775450	total: 1.27s	remaining: 8.43s
131:	learn: 0.3772661	total: 1.28s	remaining: 8.42s
132:	learn: 0.3770392	total: 1.29s	remaining: 8.41s
133:	learn: 0.3769117	total: 1.3s	remaining: 8.39s
134:	learn: 0.3766706	total: 1.31s	remaining: 8.38s
135:	learn: 0.3765441	total: 1.32s	remaining: 8.36s
136:	learn: 0.3763692	total: 1.32s	remaining: 8.35s
137:	learn: 0.3761145	total: 1.33s	remaining: 8.34s
138:	learn: 0.3758335	total: 1.34s	remaining: 8.33s
139:	learn: 0.3755773	total: 1.35s	remaining: 8.32s
140:	learn: 0.3753500	total: 1.36s	remaining: 8.31s
141:	learn: 0.3751867	total: 1.37s	remaining: 8.29s
142:	learn: 0.3750072	total: 1.38s	remaining: 8.28s
143:	learn: 0.3748250	total: 1.39s	remaining: 8.27s
144:	learn: 0.3745886	total: 1.4s	remaining: 8.26s
145:	learn: 0.3744019	total: 1.41s	remaining: 8.24s
146:	learn: 0.3742349	total: 1.42s	remaining: 8.24s
147:	learn: 0.3740933	total: 1.43s	remaining: 8.23s
148:	learn: 0.3737080	total: 1.44s	remaining: 8.23s
149:	learn: 0.3735023	total: 1.45s	remaining: 8.22s
150:	learn: 0.3733261	total: 1.46s	remaining: 8.21s
151:	learn: 0.3731771	total: 1.47s	remaining: 8.2s
152:	learn: 0.3730342	total: 1.48s	remaining: 8.19s
153:	learn: 0.3728748	total: 1.49s	remaining: 8.18s
154:	learn: 0.3726281	total: 1.5s	remaining: 8.17s
155:	learn: 0.3725349	total: 1.51s	remaining: 8.15s
156:	learn: 0.3723351	total: 1.52s	remaining: 8.14s
157:	learn: 0.3719873	total: 1.53s	remaining: 8.14s
158:	learn: 0.3715016	total: 1.54s	remaining: 8.13s
159:	learn: 0.3713499	total: 1.55s	remaining: 8.12s
160:	learn: 0.3712222	total: 1.55s	remaining: 8.11s
161:	learn: 0.3711066	total: 1.56s	remaining: 8.09s
162:	learn: 0.3708879	total: 1.57s	remaining: 8.09s
163:	learn: 0.3706967	total: 1.58s	remaining: 8.08s
164:	learn: 0.3705621	total: 1.59s	remaining: 8.07s
165:	learn: 0.3703367	total: 1.6s	remaining: 8.06s
166:	learn: 0.3701715	total: 1.61s	remaining: 8.05s
167:	learn: 0.3699529	total: 1.63s	remaining: 8.05s
168:	learn: 0.3698334	total: 1.64s	remaining: 8.04s
169:	learn: 0.3696653	total: 1.65s	remaining: 8.04s
170:	learn: 0.3694490	total: 1.66s	remaining: 8.03s
171:	learn: 0.3692606	total: 1.67s	remaining: 8.03s
172:	learn: 0.3690466	total: 1.68s	remaining: 8.02s
173:	learn: 0.3689382	total: 1.69s	remaining: 8.01s
174:	learn: 0.3687501	total: 1.7s	remaining: 8s
175:	learn: 0.3683563	total: 1.71s	remaining: 8s
176:	learn: 0.3681300	total: 1.72s	remaining: 7.99s
177:	learn: 0.3679491	total: 1.73s	remaining: 7.99s
178:	learn: 0.3677513	total: 1.74s	remaining: 7.99s
179:	learn: 0.3675781	total: 1.75s	remaining: 7.98s
180:	learn: 0.3674499	total: 1.76s	remaining: 7.96s
181:	learn: 0.3672899	total: 1.77s	remaining: 7.97s
182:	learn: 0.3671932	total: 1.78s	remaining: 7.96s
183:	learn: 0.3669374	total: 1.79s	remaining: 7.95s
184:	learn: 0.3668554	total: 1.8s	remaining: 7.93s
185:	learn: 0.3666371	total: 1.81s	remaining: 7.93s
186:	learn: 0.3665493	total: 1.82s	remaining: 7.92s
187:	learn: 0.3663410	total: 1.83s	remaining: 7.91s
188:	learn: 0.3662104	total: 1.84s	remaining: 7.89s
189:	learn: 0.3661031	total: 1.85s	remaining: 7.88s
190:	learn: 0.3658434	total: 1.86s	remaining: 7.88s
191:	learn: 0.3656966	total: 1.87s	remaining: 7.87s
192:	learn: 0.3655479	total: 1.88s	remaining: 7.86s
193:	learn: 0.3654126	total: 1.89s	remaining: 7.85s
194:	learn: 0.3652497	total: 1.9s	remaining: 7.84s
195:	learn: 0.3651451	total: 1.91s	remaining: 7.83s
196:	learn: 0.3649587	total: 1.92s	remaining: 7.82s
197:	learn: 0.3648182	total: 1.93s	remaining: 7.81s
198:	learn: 0.3647214	total: 1.94s	remaining: 7.8s
199:	learn: 0.3645111	total: 1.95s	remaining: 7.79s
200:	learn: 0.3642451	total: 1.96s	remaining: 7.78s
201:	learn: 0.3641054	total: 1.97s	remaining: 7.77s
202:	learn: 0.3640365	total: 1.98s	remaining: 7.76s
203:	learn: 0.3637553	total: 1.99s	remaining: 7.75s
204:	learn: 0.3636401	total: 1.99s	remaining: 7.73s
205:	learn: 0.3635216	total: 2s	remaining: 7.72s
206:	learn: 0.3633455	total: 2.01s	remaining: 7.71s
207:	learn: 0.3630823	total: 2.02s	remaining: 7.7s
208:	learn: 0.3629032	total: 2.03s	remaining: 7.7s
209:	learn: 0.3627972	total: 2.05s	remaining: 7.72s
210:	learn: 0.3627235	total: 2.06s	remaining: 7.72s
211:	learn: 0.3626059	total: 2.08s	remaining: 7.73s
212:	learn: 0.3625398	total: 2.09s	remaining: 7.72s
213:	learn: 0.3624061	total: 2.1s	remaining: 7.71s
214:	learn: 0.3621927	total: 2.11s	remaining: 7.71s
215:	learn: 0.3620923	total: 2.12s	remaining: 7.7s
216:	learn: 0.3617176	total: 2.13s	remaining: 7.69s
217:	learn: 0.3615392	total: 2.14s	remaining: 7.68s
218:	learn: 0.3614281	total: 2.15s	remaining: 7.67s
219:	learn: 0.3613792	total: 2.16s	remaining: 7.66s
220:	learn: 0.3611845	total: 2.17s	remaining: 7.65s
221:	learn: 0.3610332	total: 2.18s	remaining: 7.63s
222:	learn: 0.3608733	total: 2.19s	remaining: 7.62s
223:	learn: 0.3607885	total: 2.2s	remaining: 7.61s
224:	learn: 0.3606332	total: 2.21s	remaining: 7.6s
225:	learn: 0.3605174	total: 2.22s	remaining: 7.59s
226:	learn: 0.3603217	total: 2.23s	remaining: 7.58s
227:	learn: 0.3600618	total: 2.24s	remaining: 7.58s
228:	learn: 0.3599362	total: 2.25s	remaining: 7.57s
229:	learn: 0.3597434	total: 2.26s	remaining: 7.56s
230:	learn: 0.3594836	total: 2.27s	remaining: 7.55s
231:	learn: 0.3593486	total: 2.28s	remaining: 7.55s
232:	learn: 0.3591729	total: 2.29s	remaining: 7.54s
233:	learn: 0.3590419	total: 2.3s	remaining: 7.53s
234:	learn: 0.3589158	total: 2.31s	remaining: 7.52s
235:	learn: 0.3587452	total: 2.32s	remaining: 7.51s
236:	learn: 0.3586518	total: 2.33s	remaining: 7.5s
237:	learn: 0.3585726	total: 2.34s	remaining: 7.49s
238:	learn: 0.3584806	total: 2.35s	remaining: 7.48s
239:	learn: 0.3584054	total: 2.36s	remaining: 7.46s
240:	learn: 0.3582331	total: 2.37s	remaining: 7.46s
241:	learn: 0.3581284	total: 2.38s	remaining: 7.45s
242:	learn: 0.3580566	total: 2.39s	remaining: 7.44s
243:	learn: 0.3578782	total: 2.4s	remaining: 7.43s
244:	learn: 0.3577168	total: 2.41s	remaining: 7.43s
245:	learn: 0.3575303	total: 2.42s	remaining: 7.42s
246:	learn: 0.3574068	total: 2.43s	remaining: 7.42s
247:	learn: 0.3573122	total: 2.44s	remaining: 7.4s
248:	learn: 0.3572094	total: 2.45s	remaining: 7.39s
249:	learn: 0.3571056	total: 2.46s	remaining: 7.38s
250:	learn: 0.3569417	total: 2.47s	remaining: 7.38s
251:	learn: 0.3568275	total: 2.48s	remaining: 7.37s
252:	learn: 0.3566748	total: 2.49s	remaining: 7.36s
253:	learn: 0.3565373	total: 2.5s	remaining: 7.35s
254:	learn: 0.3564137	total: 2.51s	remaining: 7.34s
255:	learn: 0.3563258	total: 2.52s	remaining: 7.33s
256:	learn: 0.3560763	total: 2.53s	remaining: 7.32s
257:	learn: 0.3559257	total: 2.54s	remaining: 7.32s
258:	learn: 0.3557874	total: 2.55s	remaining: 7.31s
259:	learn: 0.3555632	total: 2.56s	remaining: 7.3s
260:	learn: 0.3554529	total: 2.58s	remaining: 7.29s
261:	learn: 0.3553716	total: 2.58s	remaining: 7.28s
262:	learn: 0.3551679	total: 2.59s	remaining: 7.27s
263:	learn: 0.3550106	total: 2.6s	remaining: 7.26s
264:	learn: 0.3548490	total: 2.61s	remaining: 7.25s
265:	learn: 0.3547441	total: 2.62s	remaining: 7.24s
266:	learn: 0.3546555	total: 2.63s	remaining: 7.23s
267:	learn: 0.3544961	total: 2.64s	remaining: 7.22s
268:	learn: 0.3542635	total: 2.65s	remaining: 7.21s
269:	learn: 0.3541945	total: 2.66s	remaining: 7.2s
270:	learn: 0.3540753	total: 2.67s	remaining: 7.2s
271:	learn: 0.3539553	total: 2.68s	remaining: 7.18s
272:	learn: 0.3538242	total: 2.69s	remaining: 7.18s
273:	learn: 0.3536873	total: 2.71s	remaining: 7.17s
274:	learn: 0.3535869	total: 2.72s	remaining: 7.16s
275:	learn: 0.3534530	total: 2.73s	remaining: 7.15s
276:	learn: 0.3533742	total: 2.73s	remaining: 7.14s
277:	learn: 0.3532778	total: 2.74s	remaining: 7.13s
278:	learn: 0.3531868	total: 2.75s	remaining: 7.12s
279:	learn: 0.3530629	total: 2.76s	remaining: 7.11s
280:	learn: 0.3529438	total: 2.77s	remaining: 7.1s
281:	learn: 0.3528052	total: 2.78s	remaining: 7.09s
282:	learn: 0.3526942	total: 2.79s	remaining: 7.08s
283:	learn: 0.3525293	total: 2.81s	remaining: 7.07s
284:	learn: 0.3524412	total: 2.81s	remaining: 7.06s
285:	learn: 0.3523118	total: 2.82s	remaining: 7.05s
286:	learn: 0.3522267	total: 2.83s	remaining: 7.04s
287:	learn: 0.3520789	total: 2.84s	remaining: 7.03s
288:	learn: 0.3520087	total: 2.85s	remaining: 7.02s
289:	learn: 0.3518968	total: 2.86s	remaining: 7.01s
290:	learn: 0.3518016	total: 2.87s	remaining: 7s
291:	learn: 0.3516721	total: 2.88s	remaining: 7s
292:	learn: 0.3515482	total: 2.9s	remaining: 6.99s
293:	learn: 0.3514569	total: 2.91s	remaining: 6.98s
294:	learn: 0.3513160	total: 2.92s	remaining: 6.97s
295:	learn: 0.3511797	total: 2.93s	remaining: 6.96s
296:	learn: 0.3510846	total: 2.94s	remaining: 6.95s
297:	learn: 0.3509640	total: 2.95s	remaining: 6.94s
298:	learn: 0.3508038	total: 2.96s	remaining: 6.93s
299:	learn: 0.3506807	total: 2.97s	remaining: 6.92s
300:	learn: 0.3505727	total: 2.98s	remaining: 6.91s
301:	learn: 0.3504335	total: 2.98s	remaining: 6.9s
302:	learn: 0.3503165	total: 3s	remaining: 6.89s
303:	learn: 0.3502419	total: 3s	remaining: 6.88s
304:	learn: 0.3501134	total: 3.01s	remaining: 6.87s
305:	learn: 0.3500176	total: 3.02s	remaining: 6.86s
306:	learn: 0.3499004	total: 3.04s	remaining: 6.85s
307:	learn: 0.3498139	total: 3.04s	remaining: 6.84s
308:	learn: 0.3497389	total: 3.05s	remaining: 6.83s
309:	learn: 0.3496735	total: 3.06s	remaining: 6.82s
310:	learn: 0.3495856	total: 3.07s	remaining: 6.81s
311:	learn: 0.3495090	total: 3.08s	remaining: 6.8s
312:	learn: 0.3494477	total: 3.09s	remaining: 6.79s
313:	learn: 0.3493480	total: 3.1s	remaining: 6.78s
314:	learn: 0.3492120	total: 3.11s	remaining: 6.77s
315:	learn: 0.3490856	total: 3.12s	remaining: 6.76s
316:	learn: 0.3489619	total: 3.13s	remaining: 6.75s
317:	learn: 0.3487976	total: 3.14s	remaining: 6.74s
318:	learn: 0.3486304	total: 3.15s	remaining: 6.73s
319:	learn: 0.3485384	total: 3.16s	remaining: 6.72s
320:	learn: 0.3484679	total: 3.17s	remaining: 6.71s
321:	learn: 0.3483395	total: 3.18s	remaining: 6.7s
322:	learn: 0.3481477	total: 3.19s	remaining: 6.69s
323:	learn: 0.3480412	total: 3.2s	remaining: 6.68s
324:	learn: 0.3479158	total: 3.22s	remaining: 6.7s
325:	learn: 0.3478461	total: 3.25s	remaining: 6.72s
326:	learn: 0.3477175	total: 3.26s	remaining: 6.71s
327:	learn: 0.3476634	total: 3.27s	remaining: 6.7s
328:	learn: 0.3475367	total: 3.28s	remaining: 6.69s
329:	learn: 0.3474778	total: 3.29s	remaining: 6.68s
330:	learn: 0.3474163	total: 3.3s	remaining: 6.67s
331:	learn: 0.3472988	total: 3.31s	remaining: 6.66s
332:	learn: 0.3472604	total: 3.32s	remaining: 6.64s
333:	learn: 0.3471663	total: 3.33s	remaining: 6.63s
334:	learn: 0.3470538	total: 3.34s	remaining: 6.62s
335:	learn: 0.3468305	total: 3.35s	remaining: 6.61s
336:	learn: 0.3466843	total: 3.36s	remaining: 6.6s
337:	learn: 0.3466507	total: 3.37s	remaining: 6.59s
338:	learn: 0.3465916	total: 3.37s	remaining: 6.58s
339:	learn: 0.3465327	total: 3.38s	remaining: 6.57s
340:	learn: 0.3464118	total: 3.4s	remaining: 6.56s
341:	learn: 0.3463033	total: 3.4s	remaining: 6.55s
342:	learn: 0.3462333	total: 3.41s	remaining: 6.54s
343:	learn: 0.3461245	total: 3.42s	remaining: 6.53s
344:	learn: 0.3460832	total: 3.43s	remaining: 6.52s
345:	learn: 0.3460120	total: 3.44s	remaining: 6.51s
346:	learn: 0.3458730	total: 3.45s	remaining: 6.5s
347:	learn: 0.3457928	total: 3.46s	remaining: 6.49s
348:	learn: 0.3457006	total: 3.47s	remaining: 6.48s
349:	learn: 0.3456096	total: 3.48s	remaining: 6.47s
350:	learn: 0.3455192	total: 3.49s	remaining: 6.46s
351:	learn: 0.3453873	total: 3.5s	remaining: 6.45s
352:	learn: 0.3451990	total: 3.51s	remaining: 6.44s
353:	learn: 0.3451264	total: 3.52s	remaining: 6.43s
354:	learn: 0.3450144	total: 3.53s	remaining: 6.42s
355:	learn: 0.3449577	total: 3.54s	remaining: 6.41s
356:	learn: 0.3447290	total: 3.55s	remaining: 6.4s
357:	learn: 0.3446735	total: 3.56s	remaining: 6.39s
358:	learn: 0.3445235	total: 3.57s	remaining: 6.38s
359:	learn: 0.3444384	total: 3.58s	remaining: 6.37s
360:	learn: 0.3443767	total: 3.59s	remaining: 6.36s
361:	learn: 0.3443044	total: 3.6s	remaining: 6.35s
362:	learn: 0.3442470	total: 3.61s	remaining: 6.33s
363:	learn: 0.3441816	total: 3.62s	remaining: 6.32s
364:	learn: 0.3441216	total: 3.63s	remaining: 6.31s
365:	learn: 0.3439966	total: 3.64s	remaining: 6.3s
366:	learn: 0.3438955	total: 3.65s	remaining: 6.29s
367:	learn: 0.3437943	total: 3.66s	remaining: 6.28s
368:	learn: 0.3436987	total: 3.67s	remaining: 6.28s
369:	learn: 0.3436560	total: 3.68s	remaining: 6.26s
370:	learn: 0.3436065	total: 3.69s	remaining: 6.25s
371:	learn: 0.3435266	total: 3.7s	remaining: 6.24s
372:	learn: 0.3434521	total: 3.71s	remaining: 6.23s
373:	learn: 0.3433834	total: 3.72s	remaining: 6.22s
374:	learn: 0.3432777	total: 3.73s	remaining: 6.21s
375:	learn: 0.3431544	total: 3.74s	remaining: 6.2s
376:	learn: 0.3430279	total: 3.75s	remaining: 6.19s
377:	learn: 0.3428886	total: 3.76s	remaining: 6.18s
378:	learn: 0.3428171	total: 3.77s	remaining: 6.17s
379:	learn: 0.3427037	total: 3.78s	remaining: 6.16s
380:	learn: 0.3425685	total: 3.79s	remaining: 6.15s
381:	learn: 0.3424975	total: 3.8s	remaining: 6.14s
382:	learn: 0.3423884	total: 3.81s	remaining: 6.13s
383:	learn: 0.3422697	total: 3.82s	remaining: 6.12s
384:	learn: 0.3421140	total: 3.83s	remaining: 6.11s
385:	learn: 0.3419888	total: 3.84s	remaining: 6.11s
386:	learn: 0.3418846	total: 3.85s	remaining: 6.09s
387:	learn: 0.3418321	total: 3.86s	remaining: 6.09s
388:	learn: 0.3417742	total: 3.87s	remaining: 6.08s
389:	learn: 0.3416652	total: 3.88s	remaining: 6.07s
390:	learn: 0.3415977	total: 3.89s	remaining: 6.05s
391:	learn: 0.3415334	total: 3.9s	remaining: 6.05s
392:	learn: 0.3414576	total: 3.91s	remaining: 6.04s
393:	learn: 0.3414233	total: 3.92s	remaining: 6.03s
394:	learn: 0.3413408	total: 3.93s	remaining: 6.02s
395:	learn: 0.3412038	total: 3.94s	remaining: 6s
396:	learn: 0.3411092	total: 3.95s	remaining: 6s
397:	learn: 0.3410295	total: 3.96s	remaining: 5.98s
398:	learn: 0.3409581	total: 3.96s	remaining: 5.97s
399:	learn: 0.3408782	total: 3.98s	remaining: 5.96s
400:	learn: 0.3407761	total: 3.98s	remaining: 5.95s
401:	learn: 0.3406661	total: 4s	remaining: 5.94s
402:	learn: 0.3405973	total: 4s	remaining: 5.93s
403:	learn: 0.3405195	total: 4.01s	remaining: 5.92s
404:	learn: 0.3403603	total: 4.03s	remaining: 5.92s
405:	learn: 0.3402858	total: 4.04s	remaining: 5.91s
406:	learn: 0.3402033	total: 4.05s	remaining: 5.9s
407:	learn: 0.3401541	total: 4.06s	remaining: 5.89s
408:	learn: 0.3400794	total: 4.07s	remaining: 5.88s
409:	learn: 0.3400354	total: 4.08s	remaining: 5.87s
410:	learn: 0.3399754	total: 4.08s	remaining: 5.86s
411:	learn: 0.3398636	total: 4.1s	remaining: 5.85s
412:	learn: 0.3398026	total: 4.11s	remaining: 5.84s
413:	learn: 0.3397254	total: 4.12s	remaining: 5.83s
414:	learn: 0.3395717	total: 4.13s	remaining: 5.82s
415:	learn: 0.3394965	total: 4.14s	remaining: 5.81s
416:	learn: 0.3394164	total: 4.15s	remaining: 5.8s
417:	learn: 0.3393284	total: 4.16s	remaining: 5.79s
418:	learn: 0.3392473	total: 4.17s	remaining: 5.78s
419:	learn: 0.3391697	total: 4.18s	remaining: 5.77s
420:	learn: 0.3390419	total: 4.19s	remaining: 5.76s
421:	learn: 0.3389457	total: 4.2s	remaining: 5.75s
422:	learn: 0.3388513	total: 4.21s	remaining: 5.74s
423:	learn: 0.3387568	total: 4.22s	remaining: 5.73s
424:	learn: 0.3386365	total: 4.23s	remaining: 5.72s
425:	learn: 0.3385873	total: 4.24s	remaining: 5.71s
426:	learn: 0.3385477	total: 4.25s	remaining: 5.7s
427:	learn: 0.3384731	total: 4.26s	remaining: 5.69s
428:	learn: 0.3383881	total: 4.27s	remaining: 5.68s
429:	learn: 0.3383303	total: 4.28s	remaining: 5.67s
430:	learn: 0.3382859	total: 4.29s	remaining: 5.66s
431:	learn: 0.3382177	total: 4.29s	remaining: 5.65s
432:	learn: 0.3381892	total: 4.3s	remaining: 5.64s
433:	learn: 0.3381388	total: 4.31s	remaining: 5.62s
434:	learn: 0.3380567	total: 4.32s	remaining: 5.62s
435:	learn: 0.3379996	total: 4.33s	remaining: 5.6s
436:	learn: 0.3379643	total: 4.34s	remaining: 5.59s
437:	learn: 0.3378553	total: 4.35s	remaining: 5.58s
438:	learn: 0.3377847	total: 4.36s	remaining: 5.57s
439:	learn: 0.3377234	total: 4.37s	remaining: 5.56s
440:	learn: 0.3376508	total: 4.38s	remaining: 5.55s
441:	learn: 0.3375803	total: 4.39s	remaining: 5.54s
442:	learn: 0.3375091	total: 4.4s	remaining: 5.53s
443:	learn: 0.3374074	total: 4.41s	remaining: 5.52s
444:	learn: 0.3373026	total: 4.42s	remaining: 5.51s
445:	learn: 0.3372279	total: 4.43s	remaining: 5.5s
446:	learn: 0.3371417	total: 4.44s	remaining: 5.5s
447:	learn: 0.3370552	total: 4.45s	remaining: 5.49s
448:	learn: 0.3369435	total: 4.46s	remaining: 5.48s
449:	learn: 0.3368296	total: 4.47s	remaining: 5.47s
450:	learn: 0.3367862	total: 4.48s	remaining: 5.46s
451:	learn: 0.3367058	total: 4.49s	remaining: 5.45s
452:	learn: 0.3366319	total: 4.5s	remaining: 5.44s
453:	learn: 0.3365729	total: 4.51s	remaining: 5.43s
454:	learn: 0.3365102	total: 4.52s	remaining: 5.42s
455:	learn: 0.3363901	total: 4.53s	remaining: 5.41s
456:	learn: 0.3363174	total: 4.54s	remaining: 5.4s
457:	learn: 0.3361555	total: 4.55s	remaining: 5.39s
458:	learn: 0.3360807	total: 4.57s	remaining: 5.38s
459:	learn: 0.3360285	total: 4.58s	remaining: 5.37s
460:	learn: 0.3359574	total: 4.58s	remaining: 5.36s
461:	learn: 0.3358890	total: 4.59s	remaining: 5.35s
462:	learn: 0.3357927	total: 4.6s	remaining: 5.34s
463:	learn: 0.3357076	total: 4.61s	remaining: 5.33s
464:	learn: 0.3356423	total: 4.62s	remaining: 5.32s
465:	learn: 0.3356132	total: 4.63s	remaining: 5.31s
466:	learn: 0.3355637	total: 4.64s	remaining: 5.3s
467:	learn: 0.3355049	total: 4.65s	remaining: 5.29s
468:	learn: 0.3353817	total: 4.66s	remaining: 5.28s
469:	learn: 0.3352863	total: 4.67s	remaining: 5.27s
470:	learn: 0.3351632	total: 4.68s	remaining: 5.26s
471:	learn: 0.3350828	total: 4.69s	remaining: 5.25s
472:	learn: 0.3349935	total: 4.7s	remaining: 5.24s
473:	learn: 0.3349319	total: 4.71s	remaining: 5.23s
474:	learn: 0.3348572	total: 4.72s	remaining: 5.22s
475:	learn: 0.3347349	total: 4.73s	remaining: 5.21s
476:	learn: 0.3346565	total: 4.74s	remaining: 5.2s
477:	learn: 0.3345496	total: 4.75s	remaining: 5.19s
478:	learn: 0.3343677	total: 4.76s	remaining: 5.18s
479:	learn: 0.3342745	total: 4.77s	remaining: 5.17s
480:	learn: 0.3342344	total: 4.78s	remaining: 5.16s
481:	learn: 0.3341523	total: 4.79s	remaining: 5.15s
482:	learn: 0.3341018	total: 4.8s	remaining: 5.14s
483:	learn: 0.3340125	total: 4.81s	remaining: 5.13s
484:	learn: 0.3338945	total: 4.82s	remaining: 5.12s
485:	learn: 0.3338394	total: 4.83s	remaining: 5.11s
486:	learn: 0.3337691	total: 4.84s	remaining: 5.1s
487:	learn: 0.3337134	total: 4.85s	remaining: 5.09s
488:	learn: 0.3336054	total: 4.86s	remaining: 5.08s
489:	learn: 0.3335386	total: 4.87s	remaining: 5.07s
490:	learn: 0.3334954	total: 4.88s	remaining: 5.06s
491:	learn: 0.3334531	total: 4.89s	remaining: 5.05s
492:	learn: 0.3333840	total: 4.9s	remaining: 5.04s
493:	learn: 0.3333287	total: 4.91s	remaining: 5.03s
494:	learn: 0.3332417	total: 4.92s	remaining: 5.02s
495:	learn: 0.3331874	total: 4.93s	remaining: 5.01s
496:	learn: 0.3331242	total: 4.94s	remaining: 5s
497:	learn: 0.3330657	total: 4.95s	remaining: 4.99s
498:	learn: 0.3329772	total: 4.96s	remaining: 4.98s
499:	learn: 0.3328826	total: 4.97s	remaining: 4.97s
500:	learn: 0.3328036	total: 4.98s	remaining: 4.96s
501:	learn: 0.3327389	total: 4.99s	remaining: 4.95s
502:	learn: 0.3327098	total: 5s	remaining: 4.94s
503:	learn: 0.3326597	total: 5.01s	remaining: 4.93s
504:	learn: 0.3325984	total: 5.02s	remaining: 4.92s
505:	learn: 0.3325235	total: 5.03s	remaining: 4.91s
506:	learn: 0.3324344	total: 5.04s	remaining: 4.9s
507:	learn: 0.3323590	total: 5.05s	remaining: 4.89s
508:	learn: 0.3323186	total: 5.06s	remaining: 4.88s
509:	learn: 0.3322161	total: 5.07s	remaining: 4.88s
510:	learn: 0.3321291	total: 5.08s	remaining: 4.87s
511:	learn: 0.3320070	total: 5.09s	remaining: 4.86s
512:	learn: 0.3318841	total: 5.11s	remaining: 4.85s
513:	learn: 0.3317906	total: 5.12s	remaining: 4.84s
514:	learn: 0.3317626	total: 5.13s	remaining: 4.83s
515:	learn: 0.3317109	total: 5.14s	remaining: 4.82s
516:	learn: 0.3316119	total: 5.15s	remaining: 4.81s
517:	learn: 0.3315420	total: 5.16s	remaining: 4.8s
518:	learn: 0.3314559	total: 5.17s	remaining: 4.79s
519:	learn: 0.3314027	total: 5.18s	remaining: 4.78s
520:	learn: 0.3313419	total: 5.19s	remaining: 4.77s
521:	learn: 0.3312853	total: 5.2s	remaining: 4.76s
522:	learn: 0.3312112	total: 5.21s	remaining: 4.75s
523:	learn: 0.3311699	total: 5.22s	remaining: 4.74s
524:	learn: 0.3311176	total: 5.23s	remaining: 4.73s
525:	learn: 0.3309998	total: 5.24s	remaining: 4.72s
526:	learn: 0.3309651	total: 5.25s	remaining: 4.71s
527:	learn: 0.3308890	total: 5.26s	remaining: 4.7s
528:	learn: 0.3308194	total: 5.27s	remaining: 4.69s
529:	learn: 0.3307393	total: 5.28s	remaining: 4.69s
530:	learn: 0.3306639	total: 5.29s	remaining: 4.68s
531:	learn: 0.3305918	total: 5.3s	remaining: 4.67s
532:	learn: 0.3305419	total: 5.32s	remaining: 4.66s
533:	learn: 0.3304704	total: 5.33s	remaining: 4.65s
534:	learn: 0.3304102	total: 5.33s	remaining: 4.64s
535:	learn: 0.3303305	total: 5.35s	remaining: 4.63s
536:	learn: 0.3302132	total: 5.36s	remaining: 4.62s
537:	learn: 0.3301295	total: 5.37s	remaining: 4.61s
538:	learn: 0.3300591	total: 5.38s	remaining: 4.6s
539:	learn: 0.3299648	total: 5.39s	remaining: 4.59s
540:	learn: 0.3298993	total: 5.4s	remaining: 4.58s
541:	learn: 0.3298551	total: 5.41s	remaining: 4.58s
542:	learn: 0.3297825	total: 5.42s	remaining: 4.57s
543:	learn: 0.3297381	total: 5.43s	remaining: 4.56s
544:	learn: 0.3296945	total: 5.45s	remaining: 4.55s
545:	learn: 0.3296059	total: 5.46s	remaining: 4.54s
546:	learn: 0.3295568	total: 5.46s	remaining: 4.53s
547:	learn: 0.3295169	total: 5.47s	remaining: 4.52s
548:	learn: 0.3294760	total: 5.49s	remaining: 4.51s
549:	learn: 0.3294293	total: 5.5s	remaining: 4.5s
550:	learn: 0.3294006	total: 5.5s	remaining: 4.49s
551:	learn: 0.3293280	total: 5.52s	remaining: 4.48s
552:	learn: 0.3292092	total: 5.53s	remaining: 4.47s
553:	learn: 0.3291570	total: 5.54s	remaining: 4.46s
554:	learn: 0.3291165	total: 5.55s	remaining: 4.45s
555:	learn: 0.3290520	total: 5.56s	remaining: 4.44s
556:	learn: 0.3289319	total: 5.57s	remaining: 4.43s
557:	learn: 0.3288517	total: 5.58s	remaining: 4.42s
558:	learn: 0.3287983	total: 5.59s	remaining: 4.41s
559:	learn: 0.3287365	total: 5.6s	remaining: 4.4s
560:	learn: 0.3286572	total: 5.61s	remaining: 4.39s
561:	learn: 0.3285984	total: 5.62s	remaining: 4.38s
562:	learn: 0.3285174	total: 5.63s	remaining: 4.37s
563:	learn: 0.3284051	total: 5.64s	remaining: 4.36s
564:	learn: 0.3283474	total: 5.65s	remaining: 4.35s
565:	learn: 0.3282701	total: 5.66s	remaining: 4.34s
566:	learn: 0.3281742	total: 5.67s	remaining: 4.33s
567:	learn: 0.3280731	total: 5.68s	remaining: 4.32s
568:	learn: 0.3279683	total: 5.69s	remaining: 4.31s
569:	learn: 0.3278831	total: 5.7s	remaining: 4.3s
570:	learn: 0.3278334	total: 5.71s	remaining: 4.29s
571:	learn: 0.3277330	total: 5.72s	remaining: 4.28s
572:	learn: 0.3276666	total: 5.73s	remaining: 4.27s
573:	learn: 0.3275521	total: 5.74s	remaining: 4.26s
574:	learn: 0.3275290	total: 5.75s	remaining: 4.25s
575:	learn: 0.3274340	total: 5.76s	remaining: 4.24s
576:	learn: 0.3273935	total: 5.77s	remaining: 4.23s
577:	learn: 0.3273401	total: 5.78s	remaining: 4.22s
578:	learn: 0.3272698	total: 5.79s	remaining: 4.21s
579:	learn: 0.3271614	total: 5.8s	remaining: 4.2s
580:	learn: 0.3270908	total: 5.81s	remaining: 4.19s
581:	learn: 0.3270151	total: 5.82s	remaining: 4.18s
582:	learn: 0.3269404	total: 5.83s	remaining: 4.17s
583:	learn: 0.3268801	total: 5.84s	remaining: 4.16s
584:	learn: 0.3267973	total: 5.86s	remaining: 4.15s
585:	learn: 0.3267432	total: 5.86s	remaining: 4.14s
586:	learn: 0.3266530	total: 5.88s	remaining: 4.13s
587:	learn: 0.3265431	total: 5.89s	remaining: 4.12s
588:	learn: 0.3264980	total: 5.89s	remaining: 4.11s
589:	learn: 0.3264334	total: 5.91s	remaining: 4.1s
590:	learn: 0.3263773	total: 5.91s	remaining: 4.09s
591:	learn: 0.3263285	total: 5.92s	remaining: 4.08s
592:	learn: 0.3262503	total: 5.93s	remaining: 4.07s
593:	learn: 0.3262015	total: 5.94s	remaining: 4.06s
594:	learn: 0.3259745	total: 5.95s	remaining: 4.05s
595:	learn: 0.3259196	total: 5.97s	remaining: 4.04s
596:	learn: 0.3258713	total: 5.98s	remaining: 4.03s
597:	learn: 0.3257984	total: 5.99s	remaining: 4.02s
598:	learn: 0.3257466	total: 6s	remaining: 4.01s
599:	learn: 0.3256967	total: 6.01s	remaining: 4s
600:	learn: 0.3256320	total: 6.02s	remaining: 3.99s
601:	learn: 0.3255872	total: 6.03s	remaining: 3.98s
602:	learn: 0.3255275	total: 6.04s	remaining: 3.97s
603:	learn: 0.3254813	total: 6.05s	remaining: 3.96s
604:	learn: 0.3253986	total: 6.06s	remaining: 3.96s
605:	learn: 0.3253306	total: 6.07s	remaining: 3.95s
606:	learn: 0.3252755	total: 6.08s	remaining: 3.94s
607:	learn: 0.3252447	total: 6.09s	remaining: 3.93s
608:	learn: 0.3251817	total: 6.1s	remaining: 3.92s
609:	learn: 0.3251085	total: 6.11s	remaining: 3.91s
610:	learn: 0.3250265	total: 6.12s	remaining: 3.9s
611:	learn: 0.3249671	total: 6.13s	remaining: 3.89s
612:	learn: 0.3248947	total: 6.14s	remaining: 3.88s
613:	learn: 0.3248347	total: 6.15s	remaining: 3.87s
614:	learn: 0.3247708	total: 6.16s	remaining: 3.86s
615:	learn: 0.3247521	total: 6.17s	remaining: 3.85s
616:	learn: 0.3247024	total: 6.18s	remaining: 3.84s
617:	learn: 0.3246269	total: 6.2s	remaining: 3.83s
618:	learn: 0.3245515	total: 6.21s	remaining: 3.82s
619:	learn: 0.3245095	total: 6.22s	remaining: 3.81s
620:	learn: 0.3244108	total: 6.23s	remaining: 3.8s
621:	learn: 0.3243770	total: 6.24s	remaining: 3.79s
622:	learn: 0.3242996	total: 6.25s	remaining: 3.78s
623:	learn: 0.3242183	total: 6.26s	remaining: 3.77s
624:	learn: 0.3241692	total: 6.27s	remaining: 3.76s
625:	learn: 0.3241248	total: 6.28s	remaining: 3.75s
626:	learn: 0.3240710	total: 6.29s	remaining: 3.74s
627:	learn: 0.3240308	total: 6.3s	remaining: 3.73s
628:	learn: 0.3239796	total: 6.31s	remaining: 3.72s
629:	learn: 0.3238963	total: 6.32s	remaining: 3.71s
630:	learn: 0.3238030	total: 6.33s	remaining: 3.7s
631:	learn: 0.3237226	total: 6.34s	remaining: 3.69s
632:	learn: 0.3236825	total: 6.35s	remaining: 3.68s
633:	learn: 0.3236004	total: 6.36s	remaining: 3.67s
634:	learn: 0.3235516	total: 6.37s	remaining: 3.66s
635:	learn: 0.3235085	total: 6.38s	remaining: 3.65s
636:	learn: 0.3234159	total: 6.39s	remaining: 3.64s
637:	learn: 0.3233759	total: 6.4s	remaining: 3.63s
638:	learn: 0.3233415	total: 6.41s	remaining: 3.62s
639:	learn: 0.3232653	total: 6.42s	remaining: 3.61s
640:	learn: 0.3232160	total: 6.43s	remaining: 3.6s
641:	learn: 0.3231510	total: 6.44s	remaining: 3.59s
642:	learn: 0.3231174	total: 6.45s	remaining: 3.58s
643:	learn: 0.3230507	total: 6.46s	remaining: 3.57s
644:	learn: 0.3229998	total: 6.47s	remaining: 3.56s
645:	learn: 0.3229319	total: 6.48s	remaining: 3.55s
646:	learn: 0.3228287	total: 6.49s	remaining: 3.54s
647:	learn: 0.3227809	total: 6.5s	remaining: 3.53s
648:	learn: 0.3227232	total: 6.51s	remaining: 3.52s
649:	learn: 0.3226608	total: 6.53s	remaining: 3.51s
650:	learn: 0.3225931	total: 6.54s	remaining: 3.5s
651:	learn: 0.3225028	total: 6.55s	remaining: 3.5s
652:	learn: 0.3224302	total: 6.56s	remaining: 3.48s
653:	learn: 0.3223523	total: 6.57s	remaining: 3.48s
654:	learn: 0.3223178	total: 6.58s	remaining: 3.47s
655:	learn: 0.3222530	total: 6.59s	remaining: 3.46s
656:	learn: 0.3221721	total: 6.6s	remaining: 3.45s
657:	learn: 0.3220951	total: 6.62s	remaining: 3.44s
658:	learn: 0.3220260	total: 6.63s	remaining: 3.43s
659:	learn: 0.3219827	total: 6.64s	remaining: 3.42s
660:	learn: 0.3218906	total: 6.65s	remaining: 3.41s
661:	learn: 0.3218629	total: 6.66s	remaining: 3.4s
662:	learn: 0.3217881	total: 6.67s	remaining: 3.39s
663:	learn: 0.3216859	total: 6.68s	remaining: 3.38s
664:	learn: 0.3216218	total: 6.69s	remaining: 3.37s
665:	learn: 0.3215292	total: 6.71s	remaining: 3.36s
666:	learn: 0.3214394	total: 6.72s	remaining: 3.35s
667:	learn: 0.3213486	total: 6.73s	remaining: 3.34s
668:	learn: 0.3213141	total: 6.74s	remaining: 3.33s
669:	learn: 0.3212410	total: 6.75s	remaining: 3.33s
670:	learn: 0.3211508	total: 6.76s	remaining: 3.31s
671:	learn: 0.3211214	total: 6.77s	remaining: 3.31s
672:	learn: 0.3210787	total: 6.78s	remaining: 3.29s
673:	learn: 0.3210038	total: 6.79s	remaining: 3.29s
674:	learn: 0.3209474	total: 6.8s	remaining: 3.28s
675:	learn: 0.3208977	total: 6.81s	remaining: 3.27s
676:	learn: 0.3208554	total: 6.82s	remaining: 3.25s
677:	learn: 0.3207988	total: 6.83s	remaining: 3.25s
678:	learn: 0.3207411	total: 6.84s	remaining: 3.23s
679:	learn: 0.3206982	total: 6.85s	remaining: 3.23s
680:	learn: 0.3206627	total: 6.86s	remaining: 3.21s
681:	learn: 0.3205764	total: 6.87s	remaining: 3.21s
682:	learn: 0.3205077	total: 6.88s	remaining: 3.19s
683:	learn: 0.3204630	total: 6.89s	remaining: 3.19s
684:	learn: 0.3204290	total: 6.91s	remaining: 3.17s
685:	learn: 0.3203906	total: 6.92s	remaining: 3.17s
686:	learn: 0.3203413	total: 6.93s	remaining: 3.15s
687:	learn: 0.3202997	total: 6.93s	remaining: 3.15s
688:	learn: 0.3202284	total: 6.95s	remaining: 3.13s
689:	learn: 0.3201773	total: 6.96s	remaining: 3.13s
690:	learn: 0.3201312	total: 6.97s	remaining: 3.12s
691:	learn: 0.3200860	total: 6.98s	remaining: 3.1s
692:	learn: 0.3200641	total: 6.99s	remaining: 3.09s
693:	learn: 0.3200254	total: 7s	remaining: 3.08s
694:	learn: 0.3199825	total: 7.01s	remaining: 3.07s
695:	learn: 0.3198533	total: 7.02s	remaining: 3.06s
696:	learn: 0.3197853	total: 7.03s	remaining: 3.06s
697:	learn: 0.3197140	total: 7.04s	remaining: 3.04s
698:	learn: 0.3196383	total: 7.05s	remaining: 3.03s
699:	learn: 0.3195766	total: 7.06s	remaining: 3.02s
700:	learn: 0.3195086	total: 7.07s	remaining: 3.02s
701:	learn: 0.3194717	total: 7.08s	remaining: 3.01s
702:	learn: 0.3193787	total: 7.09s	remaining: 3s
703:	learn: 0.3193263	total: 7.1s	remaining: 2.99s
704:	learn: 0.3192689	total: 7.11s	remaining: 2.98s
705:	learn: 0.3192121	total: 7.12s	remaining: 2.97s
706:	learn: 0.3191504	total: 7.13s	remaining: 2.96s
707:	learn: 0.3190482	total: 7.15s	remaining: 2.95s
708:	learn: 0.3189735	total: 7.16s	remaining: 2.94s
709:	learn: 0.3189295	total: 7.17s	remaining: 2.93s
710:	learn: 0.3188921	total: 7.18s	remaining: 2.92s
711:	learn: 0.3188313	total: 7.19s	remaining: 2.91s
712:	learn: 0.3187970	total: 7.2s	remaining: 2.9s
713:	learn: 0.3187415	total: 7.21s	remaining: 2.89s
714:	learn: 0.3186399	total: 7.22s	remaining: 2.88s
715:	learn: 0.3186105	total: 7.23s	remaining: 2.87s
716:	learn: 0.3185810	total: 7.24s	remaining: 2.86s
717:	learn: 0.3185438	total: 7.25s	remaining: 2.85s
718:	learn: 0.3184793	total: 7.26s	remaining: 2.84s
719:	learn: 0.3184090	total: 7.27s	remaining: 2.83s
720:	learn: 0.3183356	total: 7.28s	remaining: 2.82s
721:	learn: 0.3182718	total: 7.29s	remaining: 2.81s
722:	learn: 0.3182198	total: 7.3s	remaining: 2.8s
723:	learn: 0.3181419	total: 7.32s	remaining: 2.79s
724:	learn: 0.3180665	total: 7.33s	remaining: 2.78s
725:	learn: 0.3179963	total: 7.33s	remaining: 2.77s
726:	learn: 0.3179030	total: 7.35s	remaining: 2.76s
727:	learn: 0.3178410	total: 7.36s	remaining: 2.75s
728:	learn: 0.3177792	total: 7.37s	remaining: 2.74s
729:	learn: 0.3176852	total: 7.38s	remaining: 2.73s
730:	learn: 0.3176168	total: 7.39s	remaining: 2.72s
731:	learn: 0.3175719	total: 7.4s	remaining: 2.71s
732:	learn: 0.3175514	total: 7.41s	remaining: 2.7s
733:	learn: 0.3175089	total: 7.42s	remaining: 2.69s
734:	learn: 0.3174546	total: 7.43s	remaining: 2.68s
735:	learn: 0.3173953	total: 7.44s	remaining: 2.67s
736:	learn: 0.3173154	total: 7.45s	remaining: 2.66s
737:	learn: 0.3172674	total: 7.46s	remaining: 2.65s
738:	learn: 0.3172263	total: 7.47s	remaining: 2.64s
739:	learn: 0.3170689	total: 7.48s	remaining: 2.63s
740:	learn: 0.3169877	total: 7.49s	remaining: 2.62s
741:	learn: 0.3169220	total: 7.5s	remaining: 2.61s
742:	learn: 0.3168991	total: 7.51s	remaining: 2.6s
743:	learn: 0.3168154	total: 7.52s	remaining: 2.59s
744:	learn: 0.3167703	total: 7.54s	remaining: 2.58s
745:	learn: 0.3167486	total: 7.54s	remaining: 2.57s
746:	learn: 0.3167184	total: 7.55s	remaining: 2.56s
747:	learn: 0.3166092	total: 7.56s	remaining: 2.55s
748:	learn: 0.3165521	total: 7.57s	remaining: 2.54s
749:	learn: 0.3165063	total: 7.58s	remaining: 2.53s
750:	learn: 0.3164584	total: 7.59s	remaining: 2.52s
751:	learn: 0.3163647	total: 7.61s	remaining: 2.51s
752:	learn: 0.3162764	total: 7.62s	remaining: 2.5s
753:	learn: 0.3162222	total: 7.63s	remaining: 2.49s
754:	learn: 0.3161908	total: 7.64s	remaining: 2.48s
755:	learn: 0.3161632	total: 7.65s	remaining: 2.47s
756:	learn: 0.3161092	total: 7.66s	remaining: 2.46s
757:	learn: 0.3160704	total: 7.67s	remaining: 2.45s
758:	learn: 0.3159931	total: 7.68s	remaining: 2.44s
759:	learn: 0.3159454	total: 7.7s	remaining: 2.43s
760:	learn: 0.3159228	total: 7.71s	remaining: 2.42s
761:	learn: 0.3158775	total: 7.72s	remaining: 2.41s
762:	learn: 0.3158278	total: 7.73s	remaining: 2.4s
763:	learn: 0.3157569	total: 7.74s	remaining: 2.39s
764:	learn: 0.3157113	total: 7.75s	remaining: 2.38s
765:	learn: 0.3156312	total: 7.76s	remaining: 2.37s
766:	learn: 0.3155549	total: 7.77s	remaining: 2.36s
767:	learn: 0.3154751	total: 7.78s	remaining: 2.35s
768:	learn: 0.3154085	total: 7.79s	remaining: 2.34s
769:	learn: 0.3153520	total: 7.8s	remaining: 2.33s
770:	learn: 0.3152814	total: 7.81s	remaining: 2.32s
771:	learn: 0.3152459	total: 7.82s	remaining: 2.31s
772:	learn: 0.3151935	total: 7.83s	remaining: 2.3s
773:	learn: 0.3149086	total: 7.85s	remaining: 2.29s
774:	learn: 0.3148724	total: 7.86s	remaining: 2.28s
775:	learn: 0.3148219	total: 7.87s	remaining: 2.27s
776:	learn: 0.3147377	total: 7.88s	remaining: 2.26s
777:	learn: 0.3147151	total: 7.89s	remaining: 2.25s
778:	learn: 0.3146768	total: 7.9s	remaining: 2.24s
779:	learn: 0.3145838	total: 7.91s	remaining: 2.23s
780:	learn: 0.3145371	total: 7.92s	remaining: 2.22s
781:	learn: 0.3144340	total: 7.93s	remaining: 2.21s
782:	learn: 0.3144024	total: 7.94s	remaining: 2.2s
783:	learn: 0.3143143	total: 7.96s	remaining: 2.19s
784:	learn: 0.3142349	total: 7.97s	remaining: 2.18s
785:	learn: 0.3141923	total: 7.98s	remaining: 2.17s
786:	learn: 0.3141142	total: 7.99s	remaining: 2.16s
787:	learn: 0.3140546	total: 8s	remaining: 2.15s
788:	learn: 0.3139914	total: 8.01s	remaining: 2.14s
789:	learn: 0.3139409	total: 8.02s	remaining: 2.13s
790:	learn: 0.3138865	total: 8.03s	remaining: 2.12s
791:	learn: 0.3138220	total: 8.04s	remaining: 2.11s
792:	learn: 0.3137661	total: 8.05s	remaining: 2.1s
793:	learn: 0.3137225	total: 8.06s	remaining: 2.09s
794:	learn: 0.3136720	total: 8.07s	remaining: 2.08s
795:	learn: 0.3136213	total: 8.09s	remaining: 2.07s
796:	learn: 0.3135571	total: 8.1s	remaining: 2.06s
797:	learn: 0.3135011	total: 8.11s	remaining: 2.05s
798:	learn: 0.3134453	total: 8.12s	remaining: 2.04s
799:	learn: 0.3133741	total: 8.13s	remaining: 2.03s
800:	learn: 0.3133181	total: 8.14s	remaining: 2.02s
801:	learn: 0.3132588	total: 8.15s	remaining: 2.01s
802:	learn: 0.3131645	total: 8.16s	remaining: 2s
803:	learn: 0.3130917	total: 8.17s	remaining: 1.99s
804:	learn: 0.3130255	total: 8.18s	remaining: 1.98s
805:	learn: 0.3129218	total: 8.19s	remaining: 1.97s
806:	learn: 0.3128556	total: 8.2s	remaining: 1.96s
807:	learn: 0.3127946	total: 8.21s	remaining: 1.95s
808:	learn: 0.3126842	total: 8.23s	remaining: 1.94s
809:	learn: 0.3126223	total: 8.24s	remaining: 1.93s
810:	learn: 0.3125753	total: 8.25s	remaining: 1.92s
811:	learn: 0.3125097	total: 8.26s	remaining: 1.91s
812:	learn: 0.3124571	total: 8.27s	remaining: 1.9s
813:	learn: 0.3123832	total: 8.28s	remaining: 1.89s
814:	learn: 0.3123371	total: 8.29s	remaining: 1.88s
815:	learn: 0.3123042	total: 8.3s	remaining: 1.87s
816:	learn: 0.3122364	total: 8.31s	remaining: 1.86s
817:	learn: 0.3122129	total: 8.32s	remaining: 1.85s
818:	learn: 0.3121414	total: 8.33s	remaining: 1.84s
819:	learn: 0.3120868	total: 8.34s	remaining: 1.83s
820:	learn: 0.3120186	total: 8.35s	remaining: 1.82s
821:	learn: 0.3119787	total: 8.36s	remaining: 1.81s
822:	learn: 0.3119169	total: 8.37s	remaining: 1.8s
823:	learn: 0.3118506	total: 8.38s	remaining: 1.79s
824:	learn: 0.3117842	total: 8.39s	remaining: 1.78s
825:	learn: 0.3117424	total: 8.4s	remaining: 1.77s
826:	learn: 0.3116943	total: 8.41s	remaining: 1.76s
827:	learn: 0.3116494	total: 8.42s	remaining: 1.75s
828:	learn: 0.3115824	total: 8.43s	remaining: 1.74s
829:	learn: 0.3115482	total: 8.45s	remaining: 1.73s
830:	learn: 0.3114861	total: 8.46s	remaining: 1.72s
831:	learn: 0.3114407	total: 8.46s	remaining: 1.71s
832:	learn: 0.3113883	total: 8.47s	remaining: 1.7s
833:	learn: 0.3113608	total: 8.48s	remaining: 1.69s
834:	learn: 0.3112877	total: 8.5s	remaining: 1.68s
835:	learn: 0.3112274	total: 8.51s	remaining: 1.67s
836:	learn: 0.3111451	total: 8.52s	remaining: 1.66s
837:	learn: 0.3110753	total: 8.53s	remaining: 1.65s
838:	learn: 0.3110458	total: 8.54s	remaining: 1.64s
839:	learn: 0.3109625	total: 8.55s	remaining: 1.63s
840:	learn: 0.3109355	total: 8.56s	remaining: 1.62s
841:	learn: 0.3108354	total: 8.57s	remaining: 1.61s
842:	learn: 0.3108053	total: 8.58s	remaining: 1.6s
843:	learn: 0.3107403	total: 8.59s	remaining: 1.59s
844:	learn: 0.3106997	total: 8.6s	remaining: 1.58s
845:	learn: 0.3106597	total: 8.61s	remaining: 1.57s
846:	learn: 0.3105842	total: 8.62s	remaining: 1.56s
847:	learn: 0.3105270	total: 8.63s	remaining: 1.55s
848:	learn: 0.3104833	total: 8.64s	remaining: 1.54s
849:	learn: 0.3104327	total: 8.65s	remaining: 1.53s
850:	learn: 0.3103898	total: 8.66s	remaining: 1.52s
851:	learn: 0.3103489	total: 8.67s	remaining: 1.51s
852:	learn: 0.3102787	total: 8.68s	remaining: 1.5s
853:	learn: 0.3102117	total: 8.69s	remaining: 1.49s
854:	learn: 0.3101756	total: 8.7s	remaining: 1.48s
855:	learn: 0.3101117	total: 8.71s	remaining: 1.47s
856:	learn: 0.3100624	total: 8.72s	remaining: 1.46s
857:	learn: 0.3099904	total: 8.74s	remaining: 1.45s
858:	learn: 0.3099460	total: 8.75s	remaining: 1.44s
859:	learn: 0.3098964	total: 8.76s	remaining: 1.43s
860:	learn: 0.3098159	total: 8.77s	remaining: 1.42s
861:	learn: 0.3097614	total: 8.78s	remaining: 1.41s
862:	learn: 0.3097065	total: 8.79s	remaining: 1.4s
863:	learn: 0.3096625	total: 8.8s	remaining: 1.39s
864:	learn: 0.3095890	total: 8.81s	remaining: 1.38s
865:	learn: 0.3095246	total: 8.82s	remaining: 1.36s
866:	learn: 0.3094878	total: 8.84s	remaining: 1.35s
867:	learn: 0.3094541	total: 8.85s	remaining: 1.34s
868:	learn: 0.3093971	total: 8.86s	remaining: 1.33s
869:	learn: 0.3093159	total: 8.87s	remaining: 1.32s
870:	learn: 0.3092539	total: 8.88s	remaining: 1.31s
871:	learn: 0.3092002	total: 8.89s	remaining: 1.3s
872:	learn: 0.3091575	total: 8.9s	remaining: 1.29s
873:	learn: 0.3091203	total: 8.91s	remaining: 1.28s
874:	learn: 0.3090883	total: 8.92s	remaining: 1.27s
875:	learn: 0.3090488	total: 8.93s	remaining: 1.26s
876:	learn: 0.3089781	total: 8.94s	remaining: 1.25s
877:	learn: 0.3088917	total: 8.95s	remaining: 1.24s
878:	learn: 0.3088461	total: 8.96s	remaining: 1.23s
879:	learn: 0.3087920	total: 8.97s	remaining: 1.22s
880:	learn: 0.3087329	total: 8.98s	remaining: 1.21s
881:	learn: 0.3086974	total: 8.99s	remaining: 1.2s
882:	learn: 0.3086492	total: 9s	remaining: 1.19s
883:	learn: 0.3085688	total: 9.01s	remaining: 1.18s
884:	learn: 0.3084947	total: 9.03s	remaining: 1.17s
885:	learn: 0.3084352	total: 9.04s	remaining: 1.16s
886:	learn: 0.3083893	total: 9.05s	remaining: 1.15s
887:	learn: 0.3083344	total: 9.06s	remaining: 1.14s
888:	learn: 0.3082972	total: 9.07s	remaining: 1.13s
889:	learn: 0.3082618	total: 9.08s	remaining: 1.12s
890:	learn: 0.3081979	total: 9.09s	remaining: 1.11s
891:	learn: 0.3081391	total: 9.1s	remaining: 1.1s
892:	learn: 0.3080729	total: 9.11s	remaining: 1.09s
893:	learn: 0.3080325	total: 9.12s	remaining: 1.08s
894:	learn: 0.3079993	total: 9.13s	remaining: 1.07s
895:	learn: 0.3079430	total: 9.14s	remaining: 1.06s
896:	learn: 0.3078947	total: 9.15s	remaining: 1.05s
897:	learn: 0.3078345	total: 9.16s	remaining: 1.04s
898:	learn: 0.3077871	total: 9.17s	remaining: 1.03s
899:	learn: 0.3077353	total: 9.18s	remaining: 1.02s
900:	learn: 0.3076965	total: 9.19s	remaining: 1.01s
901:	learn: 0.3076432	total: 9.2s	remaining: 1s
902:	learn: 0.3075697	total: 9.21s	remaining: 990ms
903:	learn: 0.3075111	total: 9.23s	remaining: 980ms
904:	learn: 0.3074728	total: 9.24s	remaining: 970ms
905:	learn: 0.3074410	total: 9.25s	remaining: 960ms
906:	learn: 0.3073856	total: 9.26s	remaining: 949ms
907:	learn: 0.3073250	total: 9.27s	remaining: 939ms
908:	learn: 0.3072590	total: 9.28s	remaining: 929ms
909:	learn: 0.3072240	total: 9.29s	remaining: 919ms
910:	learn: 0.3071870	total: 9.3s	remaining: 909ms
911:	learn: 0.3071319	total: 9.31s	remaining: 899ms
912:	learn: 0.3070927	total: 9.32s	remaining: 889ms
913:	learn: 0.3070498	total: 9.33s	remaining: 878ms
914:	learn: 0.3070136	total: 9.35s	remaining: 868ms
915:	learn: 0.3069307	total: 9.36s	remaining: 858ms
916:	learn: 0.3069042	total: 9.37s	remaining: 848ms
917:	learn: 0.3068724	total: 9.38s	remaining: 838ms
918:	learn: 0.3067943	total: 9.39s	remaining: 827ms
919:	learn: 0.3067518	total: 9.4s	remaining: 817ms
920:	learn: 0.3066916	total: 9.41s	remaining: 807ms
921:	learn: 0.3066235	total: 9.42s	remaining: 797ms
922:	learn: 0.3065561	total: 9.43s	remaining: 787ms
923:	learn: 0.3064933	total: 9.44s	remaining: 777ms
924:	learn: 0.3064445	total: 9.45s	remaining: 767ms
925:	learn: 0.3064047	total: 9.46s	remaining: 756ms
926:	learn: 0.3063517	total: 9.47s	remaining: 746ms
927:	learn: 0.3063093	total: 9.49s	remaining: 736ms
928:	learn: 0.3062756	total: 9.5s	remaining: 726ms
929:	learn: 0.3062147	total: 9.51s	remaining: 716ms
930:	learn: 0.3061957	total: 9.52s	remaining: 705ms
931:	learn: 0.3061676	total: 9.53s	remaining: 695ms
932:	learn: 0.3061122	total: 9.54s	remaining: 685ms
933:	learn: 0.3060845	total: 9.55s	remaining: 675ms
934:	learn: 0.3060462	total: 9.56s	remaining: 664ms
935:	learn: 0.3059952	total: 9.57s	remaining: 654ms
936:	learn: 0.3058957	total: 9.58s	remaining: 644ms
937:	learn: 0.3058745	total: 9.59s	remaining: 634ms
938:	learn: 0.3058154	total: 9.6s	remaining: 624ms
939:	learn: 0.3057774	total: 9.61s	remaining: 613ms
940:	learn: 0.3057467	total: 9.62s	remaining: 603ms
941:	learn: 0.3057134	total: 9.63s	remaining: 593ms
942:	learn: 0.3056530	total: 9.64s	remaining: 583ms
943:	learn: 0.3055991	total: 9.65s	remaining: 573ms
944:	learn: 0.3055133	total: 9.66s	remaining: 562ms
945:	learn: 0.3054669	total: 9.67s	remaining: 552ms
946:	learn: 0.3054194	total: 9.69s	remaining: 542ms
947:	learn: 0.3053648	total: 9.7s	remaining: 532ms
948:	learn: 0.3053213	total: 9.71s	remaining: 522ms
949:	learn: 0.3052836	total: 9.72s	remaining: 511ms
950:	learn: 0.3052421	total: 9.73s	remaining: 501ms
951:	learn: 0.3051301	total: 9.74s	remaining: 491ms
952:	learn: 0.3051106	total: 9.75s	remaining: 481ms
953:	learn: 0.3050161	total: 9.76s	remaining: 471ms
954:	learn: 0.3049799	total: 9.77s	remaining: 460ms
955:	learn: 0.3049216	total: 9.78s	remaining: 450ms
956:	learn: 0.3048830	total: 9.79s	remaining: 440ms
957:	learn: 0.3048041	total: 9.8s	remaining: 430ms
958:	learn: 0.3047739	total: 9.81s	remaining: 420ms
959:	learn: 0.3047026	total: 9.83s	remaining: 409ms
960:	learn: 0.3046159	total: 9.84s	remaining: 399ms
961:	learn: 0.3045488	total: 9.85s	remaining: 389ms
962:	learn: 0.3045117	total: 9.86s	remaining: 379ms
963:	learn: 0.3044631	total: 9.87s	remaining: 369ms
964:	learn: 0.3044174	total: 9.88s	remaining: 358ms
965:	learn: 0.3043454	total: 9.89s	remaining: 348ms
966:	learn: 0.3042783	total: 9.9s	remaining: 338ms
967:	learn: 0.3042058	total: 9.91s	remaining: 328ms
968:	learn: 0.3041887	total: 9.93s	remaining: 318ms
969:	learn: 0.3041233	total: 9.94s	remaining: 307ms
970:	learn: 0.3040516	total: 9.95s	remaining: 297ms
971:	learn: 0.3040075	total: 9.96s	remaining: 287ms
972:	learn: 0.3039465	total: 9.97s	remaining: 277ms
973:	learn: 0.3038900	total: 9.98s	remaining: 266ms
974:	learn: 0.3038308	total: 9.99s	remaining: 256ms
975:	learn: 0.3038124	total: 10s	remaining: 246ms
976:	learn: 0.3037329	total: 10s	remaining: 236ms
977:	learn: 0.3036967	total: 10s	remaining: 225ms
978:	learn: 0.3036403	total: 10s	remaining: 215ms
979:	learn: 0.3035789	total: 10s	remaining: 205ms
980:	learn: 0.3034841	total: 10.1s	remaining: 195ms
981:	learn: 0.3034296	total: 10.1s	remaining: 185ms
982:	learn: 0.3033854	total: 10.1s	remaining: 174ms
983:	learn: 0.3033413	total: 10.1s	remaining: 164ms
984:	learn: 0.3033161	total: 10.1s	remaining: 154ms
985:	learn: 0.3032575	total: 10.1s	remaining: 144ms
986:	learn: 0.3032127	total: 10.1s	remaining: 133ms
987:	learn: 0.3031747	total: 10.1s	remaining: 123ms
988:	learn: 0.3031209	total: 10.1s	remaining: 113ms
989:	learn: 0.3030749	total: 10.2s	remaining: 103ms
990:	learn: 0.3030239	total: 10.2s	remaining: 92.4ms
991:	learn: 0.3029656	total: 10.2s	remaining: 82.1ms
992:	learn: 0.3029281	total: 10.2s	remaining: 71.8ms
993:	learn: 0.3028646	total: 10.2s	remaining: 61.6ms
994:	learn: 0.3027922	total: 10.2s	remaining: 51.3ms
995:	learn: 0.3027078	total: 10.2s	remaining: 41.1ms
996:	learn: 0.3026607	total: 10.2s	remaining: 30.8ms
997:	learn: 0.3025909	total: 10.2s	remaining: 20.5ms
998:	learn: 0.3025114	total: 10.3s	remaining: 10.3ms
999:	learn: 0.3024608	total: 10.3s	remaining: 0us
In [288]:
cat_finalscore = accuracy_score(y_test, y_pred)
In [289]:
cat_finalscore
Out[289]:
0.8547825289884963
In [290]:
r="Raw Data"
Results.append(r)
r={"Catboost":85.478252}
Results.append(r)
In [291]:
Results
Out[291]:
['df_na_mean_imp',
 {'Logistic Regression': 83.533381},
 {'xgboost': 85.24481},
 {'RandomForestClassifier': 84.833529},
 {'KNeighborsClassifier': 83.6153544},
 {'Gaussian Naive Bayes': 78.7896592},
 {'Gradient Boosting Classifier': 84.55542499},
 {'LightGBM': 85.217391304},
 {'Catboost': 85.5268311},
 'df_outliers__colna_rm',
 {'LogisticRegression': 84.5513},
 {'XGBoost': 86.0488437},
 {'RandomForestClassifier': 85.343912},
 {'knn': 84.5563},
 {'Gaussian Naive Bayes': 80.4034922},
 {'Gradient Boosting Classifier': 85.1669419},
 {'LightGBM': 85.818782},
 {'Catboost': 86.3201982},
 'df_na_rm',
 {'LogisticRegression': 84.7606032},
 {'xgboost': 86.336797},
 {'RandomForestClassifier': 86.10061},
 {'knn': 84.87246102},
 {'Gaussian Naive Bayes': 79.70595181},
 {'gradient Boosting Classifier': 85.8349078885},
 {'LightGBM': 86.4962210675},
 {'Catboost': 86.5906943},
 'Raw Data',
 {'Catboost': 85.478252}]

Finally Best model is catboost with the data set in which we just dropped the outliers¶

Catboost: 86.5906943¶

In [ ]: